Aug. 11th, 2018

For a variety of reasons, I'm going to start moving away from Twitter in earnest over the next little bit. I don't quite know where I'll wind up next (other than on Dreamwidth, of course), but I'll leave a forwarding address before I go.

I've felt for a while that Twitter is not structured to promote healthy discourse. Twitter management has made decision after decision after decision after decision focused only on maximizing revenue^Wuser engagement, to the detriment of those of us who prefer to see content that is high-quality, thoughtful and insightful (not inciteful).

I can deal with the noise—I've turned off Retweets entirely, and I use a third-party client, so I'm insulated from most of their poor UI decisions. But in recent months, I've heard one too many stories of inaction in the face of bigotry and hate, and as a result, I've lost confidence in Twitter's ability to manage the community effectively. That, for me, was the final straw.

As part of a larger thread, @Jack said [emphasis added]:

Truth is we’ve been terrible at explaining our decisions in the past. We’re fixing that. We’re going to hold Jones to the same standard we hold to every account, not taking one-off actions to make us feel good in the short term, and adding fuel to new conspiracy theories.

I agree Twitter needs to get better at explaining their decisions, and I agree it's important to be consistent in how you apply the rules. But the rules themselves need to reflect the reality that accounts with a larger audience and more influence have a correspondingly larger impact when they misbehave. Words and actions need to be judged not only by their intentions, but by the type, breadth and depth of their impact. Higher-profile accounts need to be held to a higher standard, and Twitter has consistently failed to do that.

When Twitter allows public-figure bigots like Alex Jones and Donald Trump—both of whom regularly engage in hateful and/or violent speech—to remain active on the platform while suspending relatively obscure accounts for comparatively minor infractions, it sends the message that Twitter supports racism and bigotry. When Twitter bans Richard Spencer—a man who publicly advocates for genocide—and then unbans him a month later (supposedly because he was banned for "creating multiple accounts with overlapping use"), while at the same time suspending anyone who tweets, "punch a Nazi" for advocating violence, bigots and hate-mongers know Twitter will look the other way.

The end result: Twitter has created ample opportunity for the followers of neo-Nazis and alt-right leaders to follow their example, and in doing so, has sent the implicit message that what they are saying and doing is okay.

It's not okay. In fact, it's not acceptable. And while I understand that Twitter needs to abide by their policies as written, they should have done a lot more, a lot sooner, to ensure their policies actually have the outcomes they claim to desire.

They've had months, if not years, to get this right. Based on what I've seen this week from Twitter HQ, I have no confidence they'll figure it out anytime soon, and I'm not willing to wait any longer.

— Des
