I've been speaking and writing lately about how tech companies try to shift the discussion on misinformation, polarization, harassment, etc., away from the systems and structures that are inherently toxic and toward questions of individual behavior.

This way they can blame their own users for any pathology and steer clear of calls for systemic change.

Today Elon Musk has come through with a perfect illustration for my future talks.

This should be obvious, but having an algorithm that behaves that way is a DELIBERATE CHOICE.

It would be easy enough, for example, to implement basic sentiment analysis so that the algorithm stops boosting the kinds of content you have previously reacted to negatively.
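To make the point concrete, here is a minimal, purely hypothetical sketch of such a change — nothing here reflects any real platform's code. It assumes a toy lexicon-based sentiment scorer (`sentiment`) and an invented `FeedRanker` that down-ranks content from sources a user has reacted to negatively:

```python
# Hypothetical sketch only: a real system would use a trained sentiment
# classifier and far richer reaction signals than this toy example.
from dataclasses import dataclass, field

# Toy negative-word lexicon standing in for real sentiment analysis.
NEGATIVE_WORDS = {"awful", "terrible", "hate", "disgusting"}

def sentiment(text: str) -> int:
    """Crude lexicon score: -1 per negative word found."""
    return -sum(1 for w in text.lower().split() if w in NEGATIVE_WORDS)

@dataclass
class FeedRanker:
    # Sources the user has reacted to negatively in the past.
    disliked_authors: set = field(default_factory=set)

    def record_negative_reaction(self, author: str) -> None:
        self.disliked_authors.add(author)

    def boost_score(self, author: str, text: str) -> float:
        """Multiplier applied when deciding whether to boost a post."""
        score = 1.0
        if author in self.disliked_authors:
            score *= 0.1  # stop re-boosting sources the user rejected
        if sentiment(text) < 0:
            score *= 0.5  # don't reward inflammatory content
        return score

ranker = FeedRanker()
ranker.record_negative_reaction("inflammatory_account")
print(ranker.boost_score("inflammatory_account", "this is awful and terrible"))
```

The point is not the specific weights, which are arbitrary here, but that demoting content a user has already signaled against is a small, well-understood engineering task — keeping it boosted is a choice.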

Musk is playing it both ways. He keeps the algorithm that boosts inflammatory content and drives the online conflicts that draw views and clicks, while pushing the blame for this off onto the individuals involved.

That sucks.

Greg Gentry

@Greggentry@esq.social

"He keeps the algorithm..."

Perhaps it's due to other changes (such as reinstating formerly banned accounts), but Twitter feels like it has amplified this algorithmic choice.

January 17, 2023 at 5:08:45 PM

@Greggentry @ct_bergstrom Yes. Instead of seeing accounts I follow, I’m offered up accounts those accounts follow. If I’m following legal analysts who follow domestic extremists, I want the analyses, not the hot takes from the extremists, in my feed.
