Elon Musk’s Quest to Make Twitter Worse

Ralph Spencer, Omar Wasow, Sarah Roberts

Elon Musk’s presence has loomed over Twitter since he announced plans to purchase the platform. And in the few weeks that he’s been in charge, many of those concerns have proven justified. Musk laid off 3,700 employees, and then 4,400 contractors. He is firing those who are critical of him. The verification process, perhaps one of Twitter’s most trusted features, has been unraveled. He’s offered severance to those who don’t want to be part of “extremely hardcore” Twitter. Following the results of a Twitter poll, he reinstated the account of Donald Trump, who was suspended from the platform for his role in inciting the January 6th attack.

So, what happens now? What of the many social movements that manifested on Twitter? While some movements and followings may re-form on other platforms, not everything will be fully recreated. For example, as writer Jason Parham explains, “whatever the destination, Black Twitter will be increasingly difficult to recreate.”

In this episode of Community Signal, Patrick speaks to three experts about the current state and future of Twitter: Sarah T. Roberts, associate professor in the Department of Information Studies at UCLA; trust and safety consultant Ralph Spencer; and Omar Wasow, assistant professor in UC Berkeley’s Department of Political Science and co-founder of BlackPlanet. They dissect the realities facing the platform today, including content moderation, loss of institutional knowledge, and uncertainty about Twitter’s infrastructure, but also emphasize the importance of Twitter as a social utility for news and more.

This episode also touches on:

  • The reality of moderating a platform like Twitter
  • What platforms actually mean when they say they’re for “free speech”
  • How Musk tanked the value of verification on Twitter

Big Quotes

On the future of content moderation at Twitter (8:28): “There’s no way possible with the cuts [Musk has] made that he’s going to be able to do any type of content moderation. … [He] isn’t going to have anybody who remotely begins to know how to do that [legal compliance and related work].” –Ralph Spencer

Sarah T. Roberts’ moderation challenge for Elon Musk (11:19): “I want Elon Musk to spend one day as a frontline production content moderator, and then get back to this [Community Signal] crew about how that went. Let us know what you saw. Share with us how easy it was to stomach that. Were you able to keep up with the expected pace at Twitter? Could you … make good decisions over 90% of the time, over 1,000, 2,000 times a day? Could you do that all the while seeing animals being harmed, kids being beat on, [and] child sexual exploitation material?” –@ubiquity75

Bumper sticker wisdom doesn’t make good policy (15:46): “Everything [Musk has said about free speech] has had the quality of good bumper stickers but is totally divorced from reality, and that doesn’t bode well, obviously.” –@owasow

The responsibility in leading a social media platform (19:41): “One thing that we are seeing in real-time [at Twitter] is what a danger there is in having one individual – especially a very privileged individual who does not live in the same social milieu as almost anyone else in the world – one very privileged individual’s ability to be the arbiter of … these profoundly contested ideological notions of something like free speech which again is continually misapplied in this realm.” –@ubiquity75

Musk’s peddling of conspiracy theories (20:29): “[Musk is] running around tweeting that story about Nancy Pelosi’s husband, the false article about what happened between him and his attacker. What kind of example is that to set? … What it is to me is like this kid who has way too much money, and he found a new toy he wants to play with.” –Ralph Spencer

Leading with humility (21:23): “[If you’re running a site like Twitter,] you have to have a ‘small d’ democratic personality, which is to say you really have to be comfortable with a thousand voices flourishing, a lot of them being critical of you, and that’s not something that you take personally.” –@owasow

There are always limits on speech (23:50): “When you declare that your product, your site, your platform, your service is a free speech zone, there is always going to be a limit on that speech. … [CSAM] is the most extreme example that we can come up with, but that is content moderation. To remove that material, to disallow it, to enforce the law means that there is a limit on speech, and there ought to be in that case. If there’s a limit on speech, it is by definition not a free speech site. Then we have to ask, well, what are the limits, and who do they serve?” –@ubiquity75

“Free speech” platforms are not a thing (25:25): “When I hear people invoke free speech on a for-profit social media site, not only does that not exist today, it never has existed, and it never will exist. Let’s deal with what reality is actually giving us and talk about that instead of these fantasies that actually are pretty much not good for anyone involved.” –@ubiquity75

The social weight and trust that verification brought to interactions on Twitter (32:52): “[Twitter] has outsized social impact, whether it’s in the political arena, whether it’s in social movements, whether it’s in celebrity usage, all of these things have been true. In terms of political movements, the good, bad, the ugly. We saw an insurrection against the United States launched by the President of the United States on Twitter, so it’s not all rosy, but the point is that Twitter had this outsized power and part of that could be attributed … to this verification process that let a lot of high profile folks, prominent individuals, media organizations, other kinds of people in the zeitgeist or in the public eye, engage with a certain sense of security.” –@ubiquity75

How does Twitter sustain its infrastructure amidst the mass layoffs and resignations? (39:18): “We have good reason to fear that [Twitter’s] infrastructure is going to get considerably worse over time. [Musk has] fired enough of the people. … In a lot of ways, [Twitter is] like a telephone company. It’s got a lot of boring infrastructure that it has to maintain so that it’s reliable. [Musk has] taken a bunch of these pillars or blocks in the Jenga stack and knocked them out, and it’s a lot more wobbly now.” –@owasow

Musk’s Twitter user experience is not the common one (48:23): “[Musk is] obsessed with bots and spam, but why is that such a compulsion for him? Well, he has 100-plus million followers, and when he looks at his replies, there’s probably a lot of bots and spam there. That’s not where I live because I’m a civilian. His perspective is distorted in a way partly by the investment around him but partly also by just being so way out of proportion to almost any other human on Earth.” –@owasow

About Our Guests

Omar Wasow is an assistant professor in UC Berkeley’s Department of Political Science. His research focuses on race, politics, and statistical methods. Previously, Omar co-founded BlackPlanet, an early leading social network, and was a regular technology analyst on radio and television. He received a PhD in African American Studies, an MA in government, and an MA in statistics from Harvard University.

Ralph Spencer has been working to make online spaces safer for more than 20 years, starting with his time as a club editorial specialist (message board editor) at Prodigy, and then graduating to America Online. During his time at AOL, he was in charge of all issues involving Child Sexual Abuse Material (CSAM). The evidence that Ralph and the team he worked with in AOL’s legal department compiled contributed to numerous arrests and convictions of individuals for the possession and distribution of CSAM. He currently works as a freelance trust and safety consultant.

Sarah T. Roberts is an associate professor in the Department of Information Studies at UCLA. She holds a PhD from the iSchool at the University of Illinois at Urbana-Champaign. Her book on commercial content moderation, Behind the Screen, was published in 2019 by Yale University Press. She served as a consultant on, and is featured in, the award-winning documentary The Cleaners. Dr. Roberts sits on the board of the IEEE Annals of the History of Computing, was a 2018 Carnegie Fellow, and was a 2018 recipient of the EFF Barlow Pioneer Award for her groundbreaking research on content moderation of social media.

Your Thoughts

If you have any thoughts on this episode that you’d like to share, please leave me a comment, send me an email or a tweet. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.
