@simon @asb
My guess is that the perceived "reduction in quality" is really people becoming more critical of the outputs. At first, LLMs seem magical, and one is willing to overlook many flaws. Over time, one becomes less astonished by the good outputs (yes, it can write a sonnet about a duck who is a pirate) and more critical of the flaws (the output doesn't fully match the formal definition of a sonnet).