Carl T. Bergstrom

I've been speaking and writing lately about how tech companies try to shift the discussion on misinformation, polarization, harassment, etc., away from the systems and structures that are inherently toxic and toward questions of individual behavior.

This way they can blame their own users for any pathology and steer clear of calls for systemic change.

Today Elon Musk has come through with a perfect illustration for my future talks.

#socialmedia #twitter #ElonMusk

11 comments
Carl T. Bergstrom

This should be obvious, but having an algorithm that behaves that way is a DELIBERATE CHOICE.

It would be easy enough, for example, to implement basic sentiment analysis so that, going forward, the algorithm doesn't boost content you have reacted negatively to.
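The kind of sentiment-aware ranking described above could be sketched roughly like this. Everything here is hypothetical: the function names, the word list, and the thresholds are illustrative assumptions, and a real platform would use a trained sentiment classifier rather than a hand-rolled lexicon.

```python
# Hypothetical sketch of sentiment-aware feed ranking (not any platform's
# actual code): demote a post when its tone resembles content the user
# has already reacted negatively to.

NEGATIVE_WORDS = {"hate", "awful", "disgusting", "enraging", "terrible"}

def sentiment_score(text):
    """Crude negativity score: fraction of words found in a small
    negative-word lexicon. A real system would use a trained model."""
    words = text.lower().split()
    if not words:
        return 0.0
    return sum(w in NEGATIVE_WORDS for w in words) / len(words)

def adjusted_rank(base_score, post_text, disliked_texts,
                  threshold=0.2, penalty=0.25):
    """Multiply the base ranking score by `penalty` when the post reads
    as negative AND the user has previously disliked similarly negative
    content; otherwise leave the score unchanged."""
    similar_tone = sentiment_score(post_text) >= threshold and any(
        sentiment_score(t) >= threshold for t in disliked_texts
    )
    return base_score * penalty if similar_tone else base_score
```

With this sketch, an inflammatory post gets demoted for a user who has disliked inflammatory content before, while a neutral post keeps its original score — the opposite of an engagement-maximizing ranker, which rewards exactly the posts that provoke reactions.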

Musk is playing it both ways. He keeps the algorithm that boosts inflammatory content and drives the online conflicts that draw views and clicks, while pushing the blame for this off onto the individuals involved.

That sucks.

Carl T. Bergstrom

I've posted a slightly longer version of this thread to post.news. For those who are interested, here's the link. I've also attached the text below.

post.news/article/2KTDq7DXiMkp

FirefighterGeek :masto:

@ct_bergstrom Does this surprise anyone? To the algo, "Enragement" == "Engagement"

John

@ct_bergstrom

Elon Musk is following the playbook of many corporations: shift responsibility onto the consumers and individuals who have the least information or knowledge.

Social media companies such as Meta and YouTube do this, even though it is their algorithms that curate users' feeds.

Fossil fuel companies do this on climate change, even as they lobby lawmakers and push false messaging.

Firearm companies do this while they ensure they are protected from basic consumer protection law.

Cigarette companies did this.

Sadly, many libertarian groups have latched onto this lie.


The Dark Tower

@jw 𝑾𝒉𝒚, when they 𝒇𝒊𝒏𝒂𝒍𝒍𝒚 began holding tobacco companies liable, were they 𝒏𝒐𝒕 told to remove the chemical additives that make cigarettes more addictive and dangerous?
Unprocessed tobacco is 𝒏𝒐𝒕 addictive. 𝑵𝒐𝒕𝒉𝒊𝒏𝒈 found in nature is. Not until humans screw with it.

HawthornFire 🏳️‍⚧️

@ct_bergstrom I commented on post, but will repeat here - it seems like deliberately feeding white supremacist posts to Black Twitter users who have reported racist posts should be legally actionable. Algorithms are programmed by humans with human intentions.

shadownlite

@ct_bergstrom probably why I do not see any of this. I weeded through my followers and who I followed in 2018. Got rid of those prone to posting anger inducing posts and retweeting crap that gets people upset...even if they were just retweeting to make people "aware."

Then focused on art and music follows...clicking on their posts and sharing if I really liked the content.

It makes for a better Twitter experience for me.

Gail Parenti

@ct_bergstrom So sticking around to call out the bullshit results in getting more bullshit directed to you, do I have that right? Seems like a great plan.

TuxPhones.com

@ct_bergstrom
That " :unverified: ✅" alone is a UX eyesore
