Trump’s Executive Order is a Danger to Online Communities

Black Lives Matter.

As community professionals and hosts, we have the power to cultivate thoughtful spaces online. We serve communities and, if you’re a regular listener of this show, I doubt you’re serving racists.

Systemic problems can feel overwhelming, but small things make a difference. Your community and how you manage it, regardless of its size, can be part of the solution. I encourage you to think about that as you make choices that shape these platforms.

On May 28, a couple of days after Twitter added a fact-checking notice to one of his tweets, Donald Trump signed an executive order targeting online communities and platforms.

I believe that holding Trump accountable for his rhetoric and fighting white supremacy are the same fight. This executive order is designed to stop you, me, and big platforms from doing exactly that. On this episode, we’re talking with attorney Anette Beebe about the resulting fallout and answering some of your questions.

Among our topics:

  • What damage has Trump’s executive order done already?
  • How does this impact community moderation right now?
  • The publisher vs. platform “debate”
  • Does adding notices to content make you liable?

Our Podcast is Made Possible By…

If you enjoy our show, please know that it’s only possible with the generous support of our sponsor: Vanilla, a one-stop shop for online community.

Big Quotes

The importance of being able to hold Trump accountable for his tweets (1:13): “I believe that holding Trump accountable for his rhetoric and fighting white supremacy are the same fight.” –@patrickokeefe

Content moderation is hard (9:37): “It’s very tough to get content moderation right 100% of the time. It’s very subjective. We have to remember that these platforms, a lot of the big ones especially, aren’t just operating here in the U.S. They’re global. Global norms may be very different from what we’re used to here in the United States.” –@anette_beebe

Fact-checking notices don’t impact Section 230 (21:02): “Section 230 has always been about reducing liability or immunizing platforms from putting stuff up or taking stuff down. Adding more speech, [like Twitter did with its fact-checking notices], takes it away from Section 230 and starts getting into your First Amendment.

… Obviously, what the president tweeted out, talking about mail-in ballots, that’s where it started. [Twitter can] have a different opinion that, ‘No, this isn’t likely to cause fraud.’ What would someone sue for? Having a difference of opinion? That’s their First Amendment right. They can have a difference of opinion against the president. Thank you for living in the U.S. This is what we can do here.” –@anette_beebe

Editing content can cause issues (23:32): “Editing can [open you up to liability] if it’s materially contributing to the content or materially altering it. Removing a link isn’t materially altering it necessarily. I haven’t seen any cases where that has been an issue. There may be a Section 230 case that I’m just unaware of, but I’ve never seen that be an issue or fixing capitalization, some of those basic editorial things that one would do. Those have always been fine. It’s when you change ‘is a bad guy’ to ‘is not a bad guy.’ That would be materially contributing.” –@anette_beebe

Study is needed prior to legislation (31:35): “I have not seen any empirical studies done that would suggest, one way or another, that the [Section 230] harms we hear about, they’re so big that this is actually a huge issue. We’re on a 24-hour news cycle and people love dirty laundry.

… How many issues are there really? Versus how much we’re perceiving it’s a big issue because we’re hearing about it all the time? I would love to see some studies done that actually weigh these types of issues out so we can make better-informed decisions before we just put pen to paper and start legislating things without really having a full understanding.” –@anette_beebe

About Anette Beebe

Anette Beebe has worked in the legal field for over 21 years, including close to eight years as in-house counsel, and now general counsel, for Xcentric Ventures, which operates one of the oldest consumer complaint forums, RipoffReport.com. Anette formed a solo practice in late 2012 that caters primarily to businesses that operate online and to individuals who have concerns about online content. In her spare time, she works to educate youth and adults about the repercussions of internet use through public speaking and online courses via her company, Smarter Internet Use, and blogs about fighting fair on the internet.

In April 2019, she was involved with efforts to maintain the Texas Citizens Participation Act, the Texas anti-SLAPP law, and had a hand in the redraft of the modified law. Anette is a member of the Internet Lawyers Leadership Summit Group, a co-chair of the Digital Communications Committee within the American Bar Association's Forum on Communications Law, a member of the First Amendment Coalition, and a member of the International Association of Privacy Professionals.


Your Thoughts

If you have any thoughts on this episode that you’d like to share, please leave me a comment, send me an email or a tweet. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.
