How would the internet change if Section 230 of the Communications Decency Act were repealed? For U.S.-based online communities and the professionals who work for them, not for the better. In fact, David Greene, senior staff attorney and civil liberties director at the Electronic Frontier Foundation, argues that some websites and communities would disappear altogether. They simply wouldn’t be able to exist with the risk that republishing content could bring.
If you want to talk to your colleagues, your community, or your elected officials about how Section 230 protects everyone who uses and works on the internet, consider this episode your primer. Patrick and David also discuss misconceptions about Section 230 and why it’s important for all community professionals to pay attention to attempts to repeal this law.
Here’s what’s covered:
- The basics of Section 230, including who it protects and how
- How FOSTA intended to regulate sex trafficking and ultimately regulated so much more
- What elected officials are saying and hearing about Section 230
A summary of the protections that Section 230 provides: “Section 230 provides an immunity from liability for any internet … user or service provider for republishing somebody else’s content. Whereas, in another situation, you might have borne some legal liability because you republished something somebody else said or wrote, when you do that online, you cannot be liable for that. … [For example,] if you forward an email and the email you forward has awful stuff in it, you can’t be liable because the content you forwarded harmed somebody in some way. The original person who wrote the email can, but you just merely as the intermediary who forwarded the email cannot.” –@davidgreene
How Section 230 protects us all, not just “big tech”: “Section 230 protects any person or any entity that publishes other people’s content online. That is pretty much every person and entity that I know. There’s nothing in its language that limits it to big tech. I actually hear this a lot from my friends in the legacy news media community who feel like big tech got this statutory advantage that they don’t have. I always tell them, ‘Look, you publish online. You publish news online. Some of you are exclusively online news, and you enjoy the same protection when you are an intermediary, when you publish wire service stories, when you publish reader comments, when you publish advertisements. All these things are not your own content. You get the exact same protection that anybody, that big tech does, regardless of what that means.'” –@davidgreene
What community managers can do to protect Section 230: “Pay attention to efforts to limit Section 230. I think it’s actually really helpful even now just to contact your elected officials and to explain to them why it’s important to you. … They are constantly hearing from people who are blaming the whole litany of woes on Section 230 and who are talking about the bad anecdotes because Section 230 undoubtedly does protect bad actors. Any immunity is going to do that. It’s going to protect the good actors and the bad actors. Members of Congress tend to hear a lot of the bad actors’ stories. That’s what causes them to want to legislate.” –@davidgreene
About David Greene
David Greene is the senior staff attorney and civil liberties director at the Electronic Frontier Foundation. He has significant experience litigating First Amendment issues in state and federal trial and appellate courts and is one of the country’s leading advocates for and commentators on freedom of expression in the arts.
- David Greene on Twitter
- Electronic Frontier Foundation
- Section 230 of the Communications Decency Act
- Law professor Eric Goldman on Community Signal
- FOSTA and how it has impacted the internet
- The Internet Society
- Section 230 Is Not A Special “Tech Company” Immunity
- Ron Wyden, Ed Markey, Nancy Pelosi, Kamala Harris, and Elizabeth Warren
[00:00] Announcer: You’re listening to Community Signal, the podcast for online community professionals. Tweet with @communitysignal as you listen. Here’s your host, Patrick O’Keefe.
[00:00:22] Patrick O’Keefe: Hello and welcome to Community Signal. This is an important episode, especially if you’re in the US, but even if you’re not, as what happens here will certainly have repercussions elsewhere. I’ve talked about Section 230 of the Communications Decency Act many times on this show. When law professor Eric Goldman was on Community Signal, he called Section 230 “the number one most important law for the internet and for all community sites”.
230 says that you and your community aren’t liable for words written by someone else. If someone joins your community and posts something terrible, the person who posted that terrible thing is the one responsible. Section 230 empowers community management and moderation in this country. It was created to allow those who host speech to be able to curate, act with discretion and apply standards. It is crucial to our work.
But increasingly, legislators are discussing the prospect of Section 230 being repealed or weakened. It’s coming from both sides of the aisle, but the Republicans are by far the loudest. A lot of the conversation from the right relates to well-publicized bans of some conservative personalities from Facebook, Twitter and Instagram. Those bans are being spun anecdotally as persecution. The bottom line is that they just don’t like the curation, discretion and standards that are being applied. If they did, they wouldn’t even be raising the issue. They miss the obvious irony that they wouldn’t have a Facebook, Twitter or Instagram to be banned from if not for Section 230.
Repealing Section 230 is not about free speech. It’s about forcing speech on you. The beauty of Section 230 is that it allows both things you do and do not like. The freedom that allows conservatives to start an online community and moderate it as they see fit is the same freedom that allows for the creation of communities that welcome those with other views.
The people who want Section 230 to be repealed have a goal. They want to be permitted to say whatever they want, wherever they want without consequence and they’ll burn many good online communities to the ground to do it.
Thank you to our supporters on Patreon including Serena Snoad, Maggie McGarry and Jules Standen. If you’d like to join them, please find out more at communitysignal.com/innercircle.
Our guest is David Greene, senior staff attorney and civil liberties director at the EFF. He has significant experience litigating First Amendment issues in state and federal trial and appellate courts and is one of the country’s leading advocates for and commentators on freedom of expression in the arts. David, welcome to the show.
[00:02:42] David Greene: Thank you for having me.
[00:02:44] Patrick O’Keefe: It’s a pleasure. Gosh, I’ve talked about Section 230 so many times on this show. [laughs] I think my listeners might consider me to be a broken record at this point, but I really believe that we are nearing a scary moment for online communities. I don’t think that most people running an online community, from corporate to hobbyist, fully appreciate the storm that’s forming on the horizon.
Over the years, I’ve seen people across the political spectrum float the idea of repealing or weakening Section 230, but I don’t know if I’ve ever heard it this loudly coming from people in power. We have legislators like Josh Hawley, Steve King, and Ted Cruz talking about it. You have the president’s son writing op-eds in favor of it, cheered by House Minority Leader Kevin McCarthy and, of course, you have the president along for the ride, tweeting out vague missives and serving as a general hype man to the idea that something must be done about this “censorship”. I’ve been moderating online communities since two years after the Communications Decency Act was passed in 1996, and I can’t say necessarily how long I’ve been paying attention, but for however many years it’s been, I don’t think it’s ever been worse than it is right now. Right now feels like a particularly dangerous time and it feels like a time to be on high alert. Do you think I’m being overdramatic?
[00:04:06] David Greene: No. I don’t think you’re being overdramatic. I think that’s correct. I think most people who follow this issue would say that’s correct. I even think that if you asked Section 230’s biggest defenders in Congress, they would tell you that this is probably the most potent threat to the statute since its enactment.
I think in addition to what you’ve already said, the other important thing to recognize politically is that nobody seems to be happy with content moderation on the internet right now. Some people think that too much content gets removed or it’s removed for improper political reasons. Some people think that not enough content gets removed and everybody who’s unhappy focuses their attention and their unhappiness on Section 230.
Even though people might be coming at it for different reasons, everyone is fairly unified across the political spectrum on one thing: they want it to be changed.
[00:05:01] Patrick O’Keefe: When you say that, the thing that really strikes me as unfortunate is that when people think about moderation, and when they talk about it, when legislators talk about it, when reporters talk about it, it’s often in the context of Facebook and Twitter. The reality is that the vast majority of individual entities who are protected by 230, and really have sprung up because of it, are small players (of course, Facebook to me isn’t quite a community, it’s more of a social platform, but we’re all protected). It’s the individual, one-person community started by a hobbyist who loves that thing, and may not realize the protection that’s there, allowing them to moderate.
Those are the people who represent the vast majority of online communities, but because of the breadth and the overwhelming presence of Facebook in our lives, when people think about it, they just really think about Facebook and miss out on the beautiful diversity of the internet, if you will, how Section 230 has given birth to all of these great communities. If you take it away, it’s not just Facebook, it’s all these other people as well.
[00:06:06] David Greene: Well, you probably don’t even take away Facebook. I agree with you. People get angry at Facebook and Twitter and YouTube. They say, “Well, let’s go after Section 230 because it must be protecting these companies.” But when you look at why Section 230 was passed, it was to address problems of the scale of content, and to address the heckler’s veto, the idea that someone might not have the resources to investigate every complaint that they receive. Those are really things that primarily protected small platforms, small intermediaries, and then users.
The reason why platforms, intermediaries, are important and remain important is so that users don’t need to know how to program, don’t need to know how to code, in order to get their content online. That’s really who Section 230 was meant to protect.
When you look at how Section 230 is really used, that’s still who it’s used by. If we say we’re angry at the big companies, well, the big companies have the money to throw at those problems. They actually don’t rely on Section 230 practically. They might rely on it in court, but they don’t rely on it in real life nearly to the same extent as the small companies and individuals do.
That’s also why we see the big players capitulating to limits or carve-out exceptions to Section 230 all the time. I think FOSTA was an example of that, where the Internet Society ends up supporting it, but the smaller companies, represented by Engine and some of the other groups, did not.
[00:07:35] Patrick O’Keefe: I want to take a step back. I don’t want to waste too much of our time together talking about basic concepts. I’m always surprised by who doesn’t know about Section 230 in my space. For my audience of people who manage online communities, how would you sum up Section 230?
[00:07:50] David Greene: Section 230 provides an immunity from liability for any internet user or service provider for republishing somebody else’s content. Whereas, in another situation, you might have borne some legal liability because you republished something somebody else said or wrote, when you do that online, you cannot be liable for that.
[00:08:14] Patrick O’Keefe: For my audience, the best use-case example that I tend to come up with is this: we launch a forum, we host that forum somewhere, we install software, we create a community, we moderate it, we have guidelines and standards, and someone comes along and posts something awful. Because of Section 230, the host of that speech isn’t the one held accountable for that awful thing. It’s the speaker of those words, the author of those words?
[00:08:37] David Greene: Yes. An even more basic thing, maybe something most people can relate to even more: if you forward an email and the email you forward has awful stuff in it, you can’t be liable because the content you forwarded harmed somebody in some way. The original person who wrote the email can, but you just merely as the intermediary who forwarded the email cannot.
[00:08:59] Patrick O’Keefe: That’s really interesting. I want to switch gears to some of the things that you’ve commented on a bunch of times, because I’ve read through your Twitter feed. I want to hit on some of the most common things that people say about Section 230 that are wrong. Because it’s not just random people getting it wrong, it’s our leaders, it’s people in the media, it’s people who have a substantial audience online saying these things that are wrong. I have three big ones.
[00:09:25] David Greene: Let’s see if I’ve [unintelligible 00:09:25] all my hair- [crosstalk]
[00:09:27] Patrick O’Keefe: I see, what’s left. Let’s start with the so-called platform publisher distinction. This is something that President Trump has amplified on Twitter by retweeting a tweet from the COO and co-founder of The Daily Wire who said, “Those suggesting Facebook can ban anyone for any reason, because they are a private company do not understand the platform publisher distinction or the special legal protections afforded to the former.” Is he right?
[00:09:54] David Greene: It’s the exact opposite. To the extent there ever even was a legally relevant platform-publisher distinction, what Section 230 does is make it irrelevant. It says it doesn’t matter. It says even if you would have been a publisher, you’re not a publisher.
[00:10:11] Patrick O’Keefe: Just some examples of that, because I’ve talked to different community professionals. They say something like, “I’m posting in the community, I’m contributing, so I’m a publisher.” That’s sort of the standard that they sometimes will discuss or think about. That just makes no sense.
[00:10:27] David Greene: Yes. Under the law, a publisher simply means somebody who transmits information to one other person. That requires an audience of one person other than yourself and the person about whom you’re speaking. That’s all it takes to be a publisher under the law. It really means nothing more than that. Pretty much everyone is potentially a publisher. What Section 230 says is when you do this stuff online with somebody else’s content, you’re just not going to be a publisher.
[00:10:57] Patrick O’Keefe: No platform publisher distinction, check that one off. Next.
[00:11:00] David Greene: Not at all.
[00:11:01] Patrick O’Keefe: Next, I want to talk about the idea of neutrality. Something I’ve heard and something I am seeing constantly on Twitter. Neutrality in the sense that a platform or an online community must subscribe to someone’s sense of neutrality in order to be protected. I can’t tell you how many people I’ve talked to who think that moderating a community would somehow make them liable because they would be specifically choosing to remove some types of content, but not others.
Missouri senator Josh Hawley recently said, “I am rapidly losing confidence that Twitter is committed to the free speech principles that justify immunity under Section 230.” My question for you is, what free speech principles?
[00:11:46] David Greene: I think this is exactly backwards. There’s the Section 230 part. Again, what Section 230 says is neutrality is irrelevant. Section 230 specifically protects both decisions to moderate and decisions not to moderate. It doesn’t require neutrality, and it doesn’t punish you if you’re not neutral. That’s what the law says.
The right to not be neutral really has nothing to do with Section 230. That’s a right that originates in the First Amendment. In 1974, the Supreme Court found that a newspaper could not be compelled to be neutral. There was a law in Florida called the “right of reply” law that said that if a newspaper editorially endorsed a candidate for office, it needed to give the opponents space to respond.
The Supreme Court struck that down as unconstitutional interference with editorial discretion. The right to not be neutral existed way before Section 230 and is a constitutional right. All that being said, I do agree that there are some human rights implications that it’s worth considering at least that people are losing platforms. They are losing the ability to speak. Platforms have the constitutional right to do that, but I do think there are human rights implications to that, that people can be legitimately concerned about. There’s just not really a remedy for it under US law.
[00:13:12] Patrick O’Keefe: I’d like to say that Section 230 empowers us to moderate because a lot of what community management is, tends to fall into the category of curation, discretion, and moderation, and deciding, say, this is appropriate behavior, this isn’t. At the scale that most communities operate, which is much smaller, the societal implications, the human rights implications tend to be lesser than, say, Facebook because of the sheer dominance that they have in this space.
Basically, what I hear a lot of people saying now, even legislators, more or less boils down to: my friend got banned, I don’t like it, therefore something must be done. This is not unlike an argument that everybody who has ever managed an online community with any sort of guidelines has heard since the dawn of time, since the first communities, going back to The Well in the ’80s. Which is: you banned me because of X. It’s so unfair, and you’re awful, and this is bad.
I’ve had people say I banned them because they used the wrong web browser. Which was not the case. I wasn’t doing that. Section 230 is really the thing that says we can have some discretion in a community. The idea of neutrality, to go back to the start of this point, as you said, the whole purpose of Section 230 was to incentivize people to actually take action, right?
[00:14:35] David Greene: It certainly was one of the purposes. The specific context is that there are certain legal advantages to just being a conduit of information, to not touching the information at all that flows through your message board or your forum or whatever it is.
The members of Congress who, back in the ’90s, were concerned about pornography on the internet saw that as a disincentive for sites to filter out pornography. They wanted to remove that disincentive, and that’s why Section 230 had the support of that political wing, those people who really were solely concerned about pornography on the internet.
Now, I think that aside from those very specific legal disadvantages, people always had the First Amendment right to not have to carry other people’s speech. That idea of wanting people to be able to filter content without even having to get to the constitutional defense was really one of the motivating factors behind Section 230 in 1996.
[00:15:37] Patrick O’Keefe: The third thing that people say that is wrong, the last misconception I want to tackle. We talked about this a little bit, but I would love to talk a little more. It’s that Section 230 is simply a big tech immunity. That’s a phrase that for some reason has really popped up a lot for me, “big tech immunity.” Is it?
[00:15:53] David Greene: No. I just wrote a piece about this on the EFF blog. I hear that all the time too, and there’s nothing about that in the language of Section 230. It’s immunity for anybody. Anybody who publishes online, it’s immunity for. That’s how it’s used on the internet, by anyone using any type of interactive computer service. That’s why I used the example of email before.
Section 230 protects any person or any entity that publishes other people’s content online. That is pretty much every person and entity that I know. There’s nothing in its language that limits it to big tech. I actually hear this a lot from my friends in the legacy news media community who feel like big tech got this statutory advantage that they don’t have, and I always tell them, “Look, you publish online. You publish news online. Some of you are exclusively online news, and you enjoy the same protection when you are an intermediary, when you publish wire-service stories, when you publish reader comments, when you publish advertisements. All these things are not your own content. You get the exact same protection that anybody, that big tech does, regardless of what that means.”
You look at the language of the statute, there’s nothing that limits it to big tech. Then you look at how it’s actually used, and there are a lot of cases where entities other than big tech are actually using Section 230. I haven’t done a qualitative or quantitative analysis, but it seems, just from the work I did do, that it’s actually the majority of cases where Section 230 is used. It’s not used by big tech. It’s used by news media, it’s used by individual users.
Again, there’s a bunch of cases about email forwarding. It’s used by people who posted to forums and newsgroups and things like that. So, both by its language and in practice, it’s not a big tech immunity.
[00:17:49] Patrick O’Keefe: It tends to get pigeonholed, and I think you do a good job of saying: back up, it’s really all of these things. It’s any interactive computer service. It tends to get pigeonholed into, like, Facebook banned breastfeeding or something like that. Why can they do that? Well, Section 230. Let’s repeal Section 230. When really, it’s online reviews. It’s the small forum that someone posted a review on that the big company hates, and so repeal makes it easier for them to bully that forum into pulling it down. It’s online criticism. It’s so many different things beyond, say, a categorical ban by Facebook. I want to talk about this a little bit, and we’ll also link to that EFF piece about the “big tech immunity” because I really liked it. We’ll link to that for anyone who’s listening.
I want to talk about big tech briefly, because you mentioned earlier that even the people who support Section 230 are not loving the state of it, let’s say, and have talked about it. Obviously, Senator Ron Wyden, one of the co-authors of Section 230, himself sounded the alarm, I think it was last year, saying that if Facebook doesn’t get a handle on this, Section 230 could come into play.
That’s obviously a concerning thing. It’s not to say that Republicans are the only ones talking about this or adjacent topics, because they’re not. Nancy Pelosi has discussed Section 230 in loose terms. Kamala Harris was an important player with FOSTA, which you mentioned before, and has vaguely talked about holding social media platforms accountable for “hate”. Elizabeth Warren wants to break up big tech companies above $25 billion in revenue and make them utilities. That doesn’t necessarily have to touch 230, but you hinted at some concerns about human rights. Are there areas where Facebook or these large platforms should be looked at, or should be held accountable, without touching that legislation? Also, just to provide a wider picture, I’ve always felt that the biggest threat to 230 was Facebook. I’ve thought that way for a long time.
Facebook getting so big and so important that it would draw criticism in a way that would be directed at 230, for good or bad reasons. Are there things that should be done here with these big platforms that don’t touch Section 230?
[00:19:58] David Greene: Let me turn your question around in a slightly unsatisfying way, I’m sure.
[00:20:02] Patrick O’Keefe: Okay, go for it.
[00:20:03] David Greene: You should certainly look for things to do about the big companies that don’t involve Section 230, because I think they will readily accept anything you offer them about Section 230. They are the few that have the resources to hire thousands of people to actually look at all the content, which most of the proposals would require.
It’s actually to their competitive advantage to have duties attached to intermediaries, because they can handle those duties and smaller companies cannot. Not only do I think that it’s possible to address your concerns without touching Section 230, I think it’s probably not possible to address your concerns by touching it.
You’re not going to effectively address your concerns if you go at them by trying to limit Section 230, which, again, is a benefit largely for people other than Facebook and YouTube and Twitter and the big, huge players that people are mad at.
What can you do then? I don’t have an opinion on breaking up companies or things like that, but I think you need to look at what’s actually bothering you and then figure out how to address that. Again, everybody comes at this with their own beef about it. There certainly are limitations. I do believe these sites have a First Amendment right to curate their sites, even if we might sharply disagree with their curation decisions.
Sometimes I think there may not be a legal remedy for what you want to do; you may have to try to go about it another way. Facebook in particular seems to be fairly susceptible to public shaming. That has probably brought on more change at Facebook than anything else. I don’t have the answer to what the alternative to Section 230 is, but I can say that trying to address those problems with the huge platforms by limiting Section 230 would actually be counterproductive.
[00:21:54] Patrick O’Keefe: If you eliminate Section 230, it’s going to hurt the people who can’t afford to defend themselves the way Facebook can.
[00:22:01] David Greene: And if you’re concerned about competition, you’ve just handed them a huge competitive advantage.
[00:22:06] Patrick O’Keefe: There have been efforts over the years to weaken or carve out Section 230, and you mentioned one that we haven’t talked about on the show at all, FOSTA. I wondered if you might talk a little bit about the lessons that we’ve learned from those efforts. You can limit that to just FOSTA if that makes it easy enough. What lessons have we learned post-FOSTA from that, let’s say, well-meaning effort to target something, and what has the fallout been?
[00:22:32] David Greene: Yes, I can go into a little bit of background on FOSTA.
[00:22:36] Patrick O’Keefe: Sure, please.
[00:22:37] David Greene: FOSTA is the acronym for the Fight Online Sex Trafficking Act. It was a law passed last year that carves out exceptions to Section 230’s immunity in a few different ways; I’ll try to summarize it in three. Section 230 already had an existing exception, and had since it was originally passed, for violations of federal criminal law. One of the ways FOSTA expanded that was that it created a brand-new federal criminal law, which made it a crime to use online services to promote or facilitate the prostitution of another person. That’s something that was not previously illegal under federal law. That’s one thing it did.
Another thing it did: there was an existing law, passed three years prior, called the SAVE Act, which specifically targeted advertisements for sex trafficking. Sex trafficking is different from prostitution; sex trafficking specifically refers to offering somebody else for sexual services either by use of force, fraud or coercion, or if the person is under 18. It essentially said that if you knowingly participate in a venture of sex trafficking online, then that’s also a violation of federal criminal law, and again, because it’s a federal criminal law, you don’t get Section 230 protection. There were special provisions in that specifically for advertising.
[00:24:07] Patrick O’Keefe: That sounds reasonable enough. Can we all get behind that, is the thought?
[00:24:11] David Greene: That one you’re like, “Yes, yes.” What FOSTA did was actually define this idea of participation in a venture to mean knowingly assisting, supporting or facilitating. It seemed like Congress was actually trying to take this existing, very narrow exception and make it broader in some way.
We’re not quite sure what they did; there are two different knowledge standards, and it’s not quite clear which one applies and whether protections were lessened or not. The last thing FOSTA did was take Section 230 itself and say, not only are we going to have these exceptions for violations of federal criminal law, where you get prosecuted by the Department of Justice, we’re actually going to allow for civil actions, so private litigants can sue, and we’re also going to allow for state criminal prosecutions as well. There’s a little bit more detail involved in that, but that’s the general sense of what the amendments do.
We have one expanded existing federal criminal law, one brand-new federal criminal law, and then we permit both private civil lawsuits and prosecutions by states under both of those new laws. That’s what FOSTA did. What effect has it had? Well, what we’ve seen is a lot of sites just shutting down because they fear prosecution under FOSTA, and these are not child sex trafficking sites. The main site accused of sex trafficking, backpage.com, was actually shut down under existing law, law that existed prior to FOSTA.
If you read anybody talking about FOSTA and why it was passed, you won’t get very far in what they’ve written before they start talking about backpage.com, but what that proved is that they didn’t need the law to shut down that site.
What we have seen, probably the most famous example, is that Craigslist dropped its therapeutic services section and its personals section. It directly attributed those decisions to FOSTA, saying that the burden of having to monitor those sections to comply with the law was something it could not handle.
Reddit has talked about the difficulties it’s had, having to drop certain subreddits. Cloudflare, and this is not social media, it’s web infrastructure, had to drop some sites it offered web hosting and DNS services to. We’ve seen a lot of fetish sites shut down because they could not bear the risk of publishing.
We’ve had sex worker advocacy organizations have to sharply curtail their online presence. These are organizations that either advocate for the decriminalization of sex work or advocate for the right to provide health and safety information to sex workers. They’ve had to sharply curtail what they’re doing. That’s what you’ve seen. You’ve seen this law, which Congress purported to only target sex trafficking, have a much broader effect, one that hits people who think sex trafficking is awful but have lost forums for discussing the things they value.
[00:27:03] Patrick O’Keefe: I appreciate you walking through that story. That’s exactly why I asked for it. I read a piece, that I think you wrote or someone else wrote, on the EFF website, and it’s a perfect example of something that sounds reasonable, like we can all get behind it, but in practice, the people who end up going offline are the people who, again, to draw from our earlier conversation, can’t or won’t be able to fight against something that crops up.
If it's kind of an ancillary topic, if it's related, in this case, to sex workers, there's a fear that, depending on where they're based, the state they're in, who is in office, who the attorney general is, any number of things will be friendlier or more hostile to them, and they're just disappearing off the internet. Even the well-meaning things that seem obvious can have really negative repercussions once they're implemented.
[00:28:00] David Greene: That's right. Part of the consequence of FOSTA, and what advocates told Congress, was that they wrote a law that was far broader than what they said they were doing, just by the consequence of opening up liability to private people and state attorneys general. It's tremendously burdensome to defend even a meritless lawsuit. An individual might be able to handle the risk that they're going to get prosecuted by the US attorney, thinking, "Look, I'm a low priority among the things the federal government is going to go after," but that completely changes once you open it up to civil litigation. It's just a completely different risk model, because the bar to filing a civil lawsuit is so low and it's completely predictable that suits will be filed.
What happened is that intermediaries were just going to choose not to post certain speech if there was any risk that they'd get sued for it. Again, even if those lawsuits would be meritless and they could win them, just having to defend a lawsuit is very, very resource intensive, in both time and money.
[00:29:03] Patrick O'Keefe: I don't want to sound like I'm wearing a tin foil hat here, but anecdotally, just thinking about Section 230, searching for it, and reading what people are saying online, I am noticing more and more generic interest on social media in the idea of repealing Section 230.
For example, if you search Twitter for "Section 230," which you do if you have no life, or "Facebook publisher platform," you'll see a bunch of tweets from accounts that some might dismiss as bots, or just accounts with no following, promoting this idea, many of them tagging the president. This has really got me thinking: who benefits from this idea? We've talked about how big companies like Facebook would benefit in a competitive sense, but who benefits from Section 230 being repealed? Do you have any thoughts on that?
[00:29:45] David Greene: Who benefits from it being repealed? Well, again, I think the big platforms benefit; they get a competitive advantage from it being repealed. People who don't like other people's speech will benefit. I'm not sure who that is. People who don't use the internet may benefit, people who feel scared by the internet but don't themselves use it.
I don't think Section 230 is the Bible or anything, but I firmly believe that it ultimately serves users, and that we have an internet that's based on intermediaries. Without Section 230, we're just going to have a different internet. I think that's going to require that all users have a different level of technical skill than they currently have. I agree with your premise that because we all benefit from Section 230, if we get rid of it, we're just going to have to figure out a different way of using the internet the way most of us like to use it.
[00:30:37] Patrick O'Keefe: It's interesting to try to fathom. I'm not sure I know what that looks like, but let's say Section 230 is repealed tomorrow. What happens?
[00:30:45] David Greene: I think a lot of sites disappear. You'll probably lose sites like Yelp, the review sites, first. Then you might lose classified advertising sites like Craigslist, the things that really, really rely on other people's content. You're probably going to lose sort of library and archive sites like the Internet Archive, which, again, are just massive amounts of other people's content. These are entities that are either non-commercial, or aren't large enough, or are commercial but aren't resource rich.
Then I think there's going to be a slower effect once people realize that they can get sued themselves for doing things like forwarding an email or posting something themselves on Facebook. Section 230 doesn't just protect Facebook; if you post an article somebody else wrote on your Facebook page, you can get sued for that. Section 230 actually protects your ability to post other people's stuff on Facebook. At some point, once people start getting sued, which is inevitable, you're going to start seeing that disappear.
I think the other thing is that I'm not as concerned that Section 230 would be completely wiped off the books as much as that there will be attempts either to add additional subject-matter carve-outs, like FOSTA, or to impose some type of duties on intermediaries. I most frequently hear of this in terms of, well, you're okay as long as you don't know it's there, but once you know it's there, then you lose the immunity. I think that is a real possibility, and I think it's a dangerous possibility.
[00:32:18] Patrick O'Keefe: Just thinking about moderation, that's fairly ridiculous. I think a lot of online communities, like the ones that I've worked with and the ones that listen to this show, would probably immediately switch over to sort of all positive content, all the time. I think the tone and tenor of criticism online would be dramatically changed. I know from my communities: I run a martial arts community outside of my day job. I don't need to field random inquiries from every martial arts equipment manufacturer that has a problem because someone didn't like their boxing gloves or the way they were tied.
[00:32:54] David Greene: I think no one in their right mind is going to carry any type of consumer review or host a site that's likely to have that type of content, even if that's not what they intended.
[00:33:06] Patrick O'Keefe: It sounds like you're more concerned with the, to borrow a phrase, repeal and replace. We've seen how well that's gone, to repeal and replace something like Section 230. Do you think the folks who wrote Section 230, and kind of that era of legislation–?
Ed Markey, the senator from Massachusetts, came to my apartment building, of all places, to speak to our residents when they were gathering support for net neutrality. I'm in Hollywood, and there are some people with substantial Twitter followings in my building. That was sort of the genesis for the meetup. I asked him about the CDA, and we talked a little bit about Section 230.
He mentioned the idea that they didn't know where it would go; they just wanted to provide a playing field for people to innovate on. Do you think that they sort of got lucky in how it's worked out, that they left an opening for it to be successful, or do you think they really had some foresight? It's kind of an odd question, but I'm just curious what you think about that era of legislation.
[00:34:01] David Greene: I think it's a mix. I think it's functioning exactly the way some members of Congress thought it would, and it hasn't functioned the way other members thought it would. Again, there were people who were solely concerned about making sure that platforms like CompuServe could get pornography or sexual content off their sites. I think they're surprised that the law has really gone beyond that.
I think that for a lot of the proponents of the law, who were sort of really talking to the tech community, it's done exactly what they thought it would. I think it's also had the result of really allowing the internet to become the predominant place for communication.
I think it's hard to imagine that not happening, but there really were huge legal obstacles to having a large-scale global communications platform, and Section 230 removed those. I think we have no idea what the internet would look like without it. We really have no way of going back and knowing how it would have been without those legal barriers. Again, that scale of global communication, sort of near-instantaneous global communication, is just not something that the legal system had a construct for. Just creating immunity sort of allowed that to happen. If that hadn't existed, I don't know where we'd be or what the internet would look like.
[00:35:23] Patrick O'Keefe: When I was talking to Senator Markey, I mentioned how Section 230 is, in a way, the basis for my job, for what I do, and that there are careers that are dependent on it. The President is very focused on coal miners; I kind of think there are as many people who do what I do as there are coal miners. I've used that example before to say, you know what? You've got to think about this as a job creator, too, as something that leads to people having a role and responsibility that wouldn't otherwise exist, because Section 230 protects me today, as it protected a 13-year-old me when I started moderating communities in 1998.
Section 230 is important to me because of the work I do. The EFF is focused on digital rights and human rights online, so it's important to the EFF. We're making the case that it's important to everyone who uses the internet, right? These are overwhelming times, news cycle-wise, to say the least. It's easy to see how Section 230 might slip under the radar in favor of so many other issues.
For people listening, people who work in this space managing these platforms and online communities, small and large, what would you say is the best way for online community pros and leaders to remain vigilant and to raise their voices on this issue?
[00:36:37] David Greene: Well, pay attention to efforts to limit Section 230. I think it's actually really helpful, even now, just to contact your elected officials and explain to them why it's important to you. I think that can always help. They are constantly hearing from people who are blaming a whole litany of woes on Section 230 and who are talking about the bad anecdotes, because Section 230 undoubtedly does protect bad actors. Any immunity is going to do that. It's going to protect the good actors and the bad actors. Members of Congress tend to hear a lot of the bad actors' stories; that's what causes them to want to legislate.
The more of the good actors' stories they can hear, the better. You can certainly follow our website and our blog; we try to keep people informed about threats to Section 230 as well as a whole host of other issues. Occasionally, we'll have action alerts about steps people can take to oppose a specific piece of legislation or to comment on certain efforts.
[00:37:39] Patrick O'Keefe: I think for the community pros listening to this, as we see this issue becoming a problem, it's not just us contacting our representatives. In some cases, it might be us talking to the communities that we manage and saying, "Hey, if this goes away, we might go away too," and asking them to mobilize and contact their representatives to make sure they fully understand the issue.
David, thank you so much for taking some time for us today. I’ve really appreciated the opportunity to learn from your expertise and thank you for taking the time to answer my questions.
[00:38:08] David Greene: I was happy to help.
[00:38:09] Patrick O'Keefe: We've been talking with David Greene, senior staff attorney and civil liberties director for the Electronic Frontier Foundation. Visit them at eff.org. Follow David on Twitter @davidgreene; Greene is spelled G-R-E-E-N-E.
For the transcript from this episode plus highlights and links that we mentioned, please visit communitysignal.com. Community Signal is produced by Karn Broad and Carol Benovic-Bradley is our editorial lead. Until next time.
If you have any thoughts on this episode that you’d like to share, please leave me a comment, send me an email or a tweet. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.