Someone recently suggested to me that AI systems pull their users' ability toward the average. I was intrigued by this idea because it reflects my experience. I am, for example, terrible at any kind of visual art, but with something like Stable Diffusion I can produce things that are merely quite bad, whereas without it I can produce things that are absolutely terrible. Conversely, with GitHub Copilot I can write code that has more bugs and is harder to read. Watching non-programmers use it and ChatGPT with Python, I've seen them produce fairly mediocre code that mostly works.
I suppose it shouldn't surprise anyone that a machine that's trained to produce output from a statistical model built from a load of examples would tend towards the mean.
An unflattering interpretation of this would suggest that the people who are most excited by AI in any given field are the people with the least talent in that field.
@david_chisnall convergence to the mean in multiple dimensions. Take any model and it does the domain-specific version of ye olde drawing exercise of smudging with your thumb: substituting shading for high-frequency detail, and betting the bank that those details didn't matter.
What this fails to capture is that it deprives both participants of the 'jam session': turning people into unattributed autocompletes makes you forget the person, and deprives them of all of the interaction, whether that would've led to collaboration or mentoring.