I've been speaking and writing lately about how tech companies try to shift the discussion on misinformation, polarization, harassment, etc., away from the systems and structures that are inherently toxic and toward questions of individual behavior.
This way they can blame their own users for any pathology and steer clear of calls for systemic change.
Today Elon Musk has come through with a perfect illustration for my future talks.
This should be obvious, but having an algorithm that behaves that way is a DELIBERATE CHOICE.
It would be easy enough, for example, to implement basic sentiment analysis so that the algorithm stops boosting the kinds of content you have reacted negatively to in the past.
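To make the point concrete, here is a minimal sketch of what such a mechanism might look like. Everything here is hypothetical: the word list, the scoring functions, and the `user_negative_reaction_rate` signal are stand-ins, not anyone's actual ranking code. The idea is just that a crude sentiment signal, combined with a user's own reaction history, is enough to downweight inflammatory content rather than boost it.

```python
# Hypothetical sketch: downweight posts that score as inflammatory
# for users who have a history of reacting negatively to such content.

# Toy lexicon of inflammatory terms (a real system would use a trained
# sentiment model, but a word list illustrates the mechanism).
INFLAMMATORY_WORDS = {"outrage", "disgrace", "idiot", "destroy", "hate"}

def sentiment_score(text: str) -> float:
    """Crude lexicon-based score: fraction of inflammatory words."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?") in INFLAMMATORY_WORDS)
    return hits / len(words)

def adjusted_rank(base_score: float, text: str,
                  user_negative_reaction_rate: float) -> float:
    """Reduce a post's ranking score when the post is inflammatory AND
    the user tends to react negatively to inflammatory content.

    user_negative_reaction_rate: hypothetical 0..1 signal, e.g. the
    share of similar posts this user has blocked, muted, or reported.
    """
    penalty = sentiment_score(text) * user_negative_reaction_rate
    # Cap the downweight so a post is never zeroed out entirely.
    return base_score * (1.0 - min(penalty * 5.0, 0.9))

# A neutral post keeps its score; an inflammatory one is demoted
# for a user who dislikes that kind of content.
print(adjusted_rank(1.0, "a calm update about the weather", 0.8))
print(adjusted_rank(1.0, "what an outrage, this idiot will destroy us", 0.8))
```

The specific thresholds and signals are beside the point; what matters is that the choice of objective (engagement at any cost versus engagement the user actually wants) is made by the platform, not by its users.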
Musk is playing it both ways: he keeps the algorithm that boosts inflammatory content and drives the online conflicts that generate views and clicks, while pushing the blame for those conflicts onto the individuals involved.
That sucks.