@rodhilton exactly. this is the point i think everyone’s been missing. so far all the narratives have been “look how much more content a single person can produce!” but in my mind it’s just spitting out garbage faster. just more to tune out IMO.
@ben @rodhilton it will self-amplify as training sets begin to be overwhelmingly generated content, too

@ben @rodhilton It's a crisis of overproduction, and we know what happens next (it's not even close to what Marx predicted): https://worldofwonders.substack.com/p/large-language-models-and-the-overproduction
@ben @rodhilton I'm seeing this occur on #Amazon as well, where some niche books are obviously being augmented with generated text. The low quality wastes readers' time and money. I've stopped buying newly released, unreviewed books because of it.