Section 230 of the Communications Decency Act is a frequent topic of conversation on Community Signal. As Patrick puts it, if you’re a community professional in the United States, “this is the law that places the liability for speech on the author of that speech, not on you as the [community’s] host. It allows you to moderate and remove certain content while not assuming liability for what remains. I like to think of it as the legal basis for our profession in the US, and it is an important legal protection against the wealthy and powerful who would happily take down an entire online community for one post they don’t like.”
Plainly, this is a law that protects our jobs, our communities, the people in those communities, and their right to have civil and safe discussions online.
For this episode of Community Signal, we invited past guests to share how Section 230 has enabled them to foster community and what changing Section 230 could do to the fabric of online communities.
Our Podcast is Made Possible By…
If you enjoy our show, please know that it’s only possible with the generous support of our sponsor: Discourse.
“At The Times, Section 230 allowed us to build a modern news operation where we could have a public back and forth with our readers; an ongoing one. It gave us the chance to respect them by setting rules for engagement and privileging those who spent the time to be thoughtful about the news. By allowing us to pre-moderate, and not be legally liable for any mistakes we may have made in that process, [that’s] really what made our community operation at The Times economically feasible.” –@BasseyE
“As co-founder of a community software company, I personally rely on [Section 230’s] protection every day. If we go down the path of adding caveats and exceptions to Section 230, we risk losing it altogether. Yes, online content is messy – so is freedom, so is free speech. If we still believe in those things, we need to protect the innovators, not squash them under a burden of regulatory red tape or lawsuits.” –@rhogroupee
“Yes, online communities need to be moderated and cared for and that is the precise reason why Section 230 exists, to empower moderation without creating liability. If Section 230 goes away, the main groups to benefit will not be the most vulnerable users of the internet, it will be the huge platforms uniquely possessing the resources to be compliant with whatever new regulation that they will have helped to craft.” –@losowsky
“Without the protection of Section 230, a well-meaning person or organization may lose their right to maintain a clean well-lighted space for civil discussion and capitulate to every demand regardless of its merit. An organization facilitating online community may decide to close their community altogether deciding that the risk is not worth the benefit. Who really loses when we threaten the opportunities to build meaningful communities that can have a positive impact on people’s lives?” –@scottmoore
About Our Guests
- Bruce Ableson, director of evangelism and enablement at Adobe
- Gail Ann Williams who consults on community conversation and craft beer, formerly of The WELL and Salon Media Group
- Bassey Etim, editorial director at Canopy, formerly of The New York Times
- Rosemary O’Neill, president of Social Strata
- Andrew Losowsky, head of Coral at Vox Media
- Scott Moore, community veteran with a focus on nonprofits
- Michael Wood-Lewis, co-founder and CEO of Front Porch Forum
- Angela Connor, founder and CEO at Change Agent Communications
- Sponsor: Discourse, civilized discussion for teams, customers, fans, and communities
- Section 230 of the Communications Decency Act on Wikipedia
- Law professor Eric Goldman on Community Signal
- The Electronic Frontier Foundation’s civil liberties director David Greene on Community Signal
- Patrick and Scott Moore discuss Section 230
- Bruce Ableson, founder of Open Diary
- Gail Ann Williams, formerly of The WELL and Salon Media Group
- Zen kōan
- Bassey Etim of Canopy, formerly of The New York Times
- Rosemary O’Neill, co-founder of Ultimate Bulletin Board and Hoop.la
- Andrew Losowsky, head of Coral at Vox Media
- Scott Moore, a 20-year community industry veteran with a focus on nonprofits
- Michael Wood-Lewis, founder of Front Porch Forum
- Angela Connor, founder and CEO at Change Agent Communications
[0:00] Announcer: You’re listening to Community Signal, a podcast for online community professionals. Sponsored by Discourse, civilized discussion for teams, customers, fans, and communities. Tweet with @communitysignal as you listen. Here’s your host, Patrick O’Keefe.
[0:24] Patrick O’Keefe: Hello and thank you for joining me for this episode of Community Signal about Section 230 of the Communications Decency Act, a pivotal US law for community builders. But even if you aren’t in the US, it’ll benefit you to understand the legal climate for communities here and how it could influence your own.
I find there is a real blind spot with many community pros when it comes to the laws governing our work, but when it comes to the law, knowledge truly is power. It’s not something to be afraid of. I’m thankful to our Patreon supporters for seeing the value in having conversations like this. Among this group are Carol Benovic-Bradley, Luke Zimmer, and Marjorie Anderson. If you’d like to join them, please visit communitysignal.com/innercircle.
We’ve talked a lot about Section 230 here on the show. The law passed in 1996 as part of the Communications Decency Act, authored by former US representative Christopher Cox, a Republican, and current Oregon senator Ron Wyden, a Democrat, who was then in the House of Representatives. I’d love to have both of them on the show at some point.
A couple of our must-listen episodes about 230 include conversations we’ve had with law professor Eric Goldman, which also focused on how companies have used the US court system to silence speech online, and EFF civil liberties director David Greene, which was all about 230. There was also an episode where the tables were turned and past guests took on the lead hosting duties, talking with me about the influence community pros and hosts have and how we should utilize it if Section 230 were threatened.
The most recent of those episodes, the one with David Greene in May, was spurred by increasing threats to the law from both sides of the aisle: haphazard proposals focused on hurting big platforms like Facebook without fully taking into account the collateral damage that could be done.
Suffice it to say, 230 isn’t that complicated. If you operate in the US, this is the law that allows you to exercise discretion about how you manage and moderate your community. It places the liability for speech on the author of that speech, not on you as the host. It allows you to moderate and remove certain content while not assuming liability for what remains. I like to think of it as the legal basis for our profession in the US, and it is an important legal protection against the wealthy and powerful who would happily take down an entire online community for one post they don’t like.
We’ve heard from some incredible legal minds on this show, but a few months ago, I decided I wanted to assemble a collection of perspectives from people who work with online communities every day. I reached out to a number of our past guests and asked them to share their thoughts on 230. I was thankful to be able to assemble a group of people that I really respect. These perspectives are the focus of this episode.
[00:03:24] Bruce Ableson: When I think about what Section 230 means to me, it obviously was hugely important back then when we started Open Diary because as we looked at this evolving landscape, and how we would grow a community, from a few hundred people to several thousand people to several million people, there were all sorts of constraints. There were things that were unknown. There were things that we knew we could do or might not be able to do, but one of the biggest concerns was a legal one.
There had been a court case decided just about a year and a half before that held online publishers to a higher standard. It said that if an online publisher moderates or edits the content that their users post on their site, then they can be held legally liable for any offensive content or libel that is posted by anyone on that site. That decision was hugely chilling to what was going on in the late ’90s around social networking and user-generated content.
Companies like mine had no idea if we’d be able to expand and create this new thing when we had to be afraid that somebody may come along and sue us because a comment or post had said something that offended them, that we had not created. Section 230 came along and fixed that because what Section 230 said was, if a company is moderating or editing the content that other people are posting on their site, that does not make them legally liable for everything that is posted on their site and that was the key piece of it.
It wasn’t that Section 230 was put in place to protect companies and allow this “wild west” to develop where anyone could post anything but Section 230 was created to give us the ability and the freedom to moderate and try to create better communities and have people post better content. That was what allowed social networking to take off and grow to what it is today.
Today, do we need to do a better job as an industry to moderate and regulate what we see our users posting, and how bad actors are creating fake news on our sites that influences elections? Absolutely. Does the government have a place in that regulation, or should Section 230 be changed to force companies to do certain things or moderate in certain ways? I don’t believe so. That’s a huge conversation that would take much longer than the time I’ve got today.
[00:05:59] Patrick O’Keefe: Going back even further than Bruce is Gail Ann Williams, who took over pioneering online community The WELL back in 1991, and was able to see a web without 230 and with 230 having later led community for Salon Media Group, and continuing to consult on community with numerous companies over the years.
[00:06:19] Gail Ann Williams: I became involved with Section 230 of the Communications Decency Act as it emerged. I was managing the conferencing and community side of The WELL starting in 1991, and continued under three different ownership groups all the way through to 2015. The WELL had already grappled, in a rather vigorous and sometimes maddening way, with the responsibility of the individual and what the organization, the business, could do to control or not control individuals, and rather famously at the time came down to something called “you own your own words.” That was interpreted in many different ways, turned around, and became almost like a Zen kōan for some people.
Really, from the business point of view, we saw it as basically saying that business doesn’t claim anything, any rights over your creations and isn’t going to say, we are doing a compilation, and we can publish things without your consent. On the other hand, you are totally responsible as an individual for what you do. However, that wasn’t really enough without any legal underpinnings.
When Section 230 came about, that gave a really nice, comfortable zone. The fact that we didn’t have to think about liability. We already didn’t believe we should be thinking about liability, but we really didn’t have to think about it in most situations, and we still had some ability to control the tone, that was always the push and pull at The WELL. To some degree, it could be. It really isn’t now, because well.com now, it’s kind of a smaller group that has a lot more cultural homogeneity than it did at one point. I think if a bunch of newcomers came there that these things could come into play again and they may. You don’t know, as people go through the internet and settle in different places, and make friends, how they’re going to stretch the culture.
Our big concern was always, at what point do you meddle and say, “This person is causing too much chaos and is really driving so many members crazy that they need to be ejected?” The WELL did this very, very occasionally, and usually they’d go a long time before that happened and really watch the community process to see whether this person who is a troublemaker could be accommodated.
Then there were certain things that just weren’t tolerable and I think that should be the case in a community that’s managing things. I always thought it was a little bit curious the idea that you’re not supposed to meddle very much, but you can meddle a little bit. That was never very clear so we just did the best we could and tried to make a judgment call. I think that that was incredibly important for the development of online culture.
[00:09:31] Patrick O’Keefe: It’s that portion at the end that really jumps out, where Gail speaks to her thoughtfulness, the ability to make a judgment call, and how that was important for the development of online culture. Section 230 really took something that existed in the early web, people building social spaces and managing them practically and thoughtfully, and codified it into law, empowering people to continue to do so in what was, in the mid-to-late 1990s, an increasingly litigious climate for speech online as companies started to take the internet more seriously.
I’d like to take a moment to mention our great sponsor, Discourse, who we’re really glad to have.
Discourse allows you to brand your community and enhance engagement through an extensive theme and component ecosystem. With their powerful moderation tools, you’ll be able to keep discussions civil, and you’ll never be locked into licensing costs or hosting.
Whether you have a community of 10 or 10,000, Discourse scales with you. And thanks to their out of the box community health metrics, you’ll be able to measure success. Visit discourse.org to sign up for a 100% free, never-start-your-subscription-automatically trial. Use coupon code CS2019 to get 50% off your first 2 months on a Standard or Business plan.
[00:10:50] Bassey Etim: At The Times, Section 230 allowed us to build a modern news operation where we could have a public back and forth with our readers, an ongoing one. It gave us the chance to respect them by setting rules for engagement and privileging those who spent the time to be thoughtful about the news. Allowing us to pre-moderate, and not be legally liable for any mistakes we may have made in that process, is really what made our community operation at The Times economically feasible.
Here at Canopy, what we hope is that 230 will allow us to put powerful machine learning tools in the hands of the people so that everyone has a chance to compete with the big platforms and find their own independent audiences. We could do that without the fear that someone who opposes the speech on our platform could use an individual lawbreaker as a vector to sue us out of existence.
[00:11:57] Patrick O’Keefe: The revenue struggles of news organizations are well-documented, but when you think of outlets with resources, one of the first that probably comes to mind is The New York Times. Bassey saying that Section 230 is what made the community operation “economically feasible” is a big deal. What that means to me is that without 230, it wouldn’t have made sense for The Times to even allow reader comments online, due to the cost of defending themselves legally.
Bassey led a community team that he grew to around 15 people, which to this day is the largest community-focused team within a media organization that I have heard of. Bassey also highlights how 230 enables competition, leveling the playing field with big platforms. If this protection is weakened, the people and organizations left will be those with the money to defend themselves, which drives smaller players and individuals out of the market.
[00:12:53] Rosemary O’Neill: What I see happening now with the CDA, and the politically neutral content moderation legislation that’s been introduced, is a legislative approach to something that should simply be litigated by the injured parties. That’s a cause for legal action, not federal government intervention. As Section 230 currently stands, infrastructure providers, software developers, and the makers of tools for content publishing are not liable for content that’s published using those tools. This protection has made it safe to innovate and develop countless outlets for online discourse and free speech.
As co-founder of a community software company, I personally rely on that protection every day. If we go down the path of adding caveats and exceptions to Section 230, we risk losing it altogether. Yes, online content is messy, so is freedom, so is free speech. If we still believe in those things, we need to protect the innovators not squash them under a burden of regulatory red tape or lawsuits.
[00:13:56] Patrick O’Keefe: One of the things that Rosemary alluded to is how Section 230 is a tool that empowers free speech. Sometimes people, especially those banned from an online community, get angry and think that moderation is an infringement on their free speech, but 230 is a form of that same freedom, because it allows us to curate spaces where different groups can feel welcome. If you don’t like it, you can always create your own community. Free speech doesn’t mean that you’re able to spill whatever you want, wherever you want, without consequence.
[00:14:29] Andrew Losowsky: We provide online community software and resources to more than 65 newsrooms around the world, including The Washington Post and The Wall Street Journal. Section 230 is what underpins online community as we know it in the USA. At Coral, we work with newsrooms around the world. In countries with fewer protections such as Australia, it’s extremely difficult for hobbyists and small companies to run their own communities because of the legal liability that brings.
Yes, we do need to do better with online communities. Yes, online communities need to be moderated and cared for, and that is the precise reason why Section 230 exists: to empower moderation without creating liability. If Section 230 goes away, the main groups to benefit will not be the most vulnerable users of the internet; it will be the huge platforms uniquely possessing the resources to be compliant with whatever new regulation they will have helped to craft.
If we want to make online communities safer, what we need is rules that make it easier to enter the marketplace and disincentivize venture capital and its mentality of short-term growth at all costs, which creates negative incentives for safer communities. By introducing such rules, we can enable a more diverse range of companies, communities, and non-profits to create safer and better spaces for online interaction, all of which will need to be protected from frivolous lawsuits by the shield of Section 230.
[00:16:15] Patrick O’Keefe: Andrew has talked extensively with news organizations around the world, so when he explains how difficult it is for hobbyists and small businesses to run online communities in countries without protections like those afforded by 230, that’s something worth heeding. I was also struck by his comment about how the most vulnerable users of the internet stand to lose from a weakening of Section 230, as it could impact their ability to create safe, inclusive spaces online. Next up is twenty-plus-year community veteran Scott Moore, who has extensive experience with nonprofits.
[00:16:48] Scott Moore: Recently there’s been a lot of public discussion about how Section 230 of the CDA is protecting large tech companies that don’t need protection. We cannot forget that Section 230 protects online communities of any size and type from liability just because of what someone happens to post. I’m talking about communities such as parents advocating to hold public schools accountable for their special-needs kids, people who are disabled organizing to protect their rights and expand their opportunities, advocacy groups of any kind trying to legally and civilly enact political or social change, locally organized online spaces that strengthen local offline relationships, and any special interest group, including hobbyists, book clubs, travel clubs, or sports fans.
These communities may be hosting their own spaces or using a platform offered by a large tech company to build genuine community. In each case, anyone might post material that another dislikes even if it violates no laws and rather than hold the individual who published the post accountable, an offended person may try to threaten a lawsuit against the host. That’s just the way people operate.
The mere threat of a lawsuit can be too much for some people trying to make their corner of the world a little better. Without the protection of Section 230, a well-meaning person or organization may lose their right to maintain a clean well-lighted space for civil discussion and capitulate to every demand regardless of its merit. An organization facilitating online community may decide to close their community altogether deciding that the risk is not worth the benefit. Who really loses when we threaten the opportunities to build meaningful communities that can have a positive impact on people’s lives?
[00:18:25] Patrick O’Keefe: Similar to Andrew, Scott highlights a big question here: who really benefits from reducing the protections of Section 230? In addition, Scott points to an interesting wrinkle, which is that many people build communities on larger platforms. When legislators write laws to hurt Facebook, what happens to community hosts who use Facebook groups? Small communities of vulnerable people sometimes use larger platforms, and they should be accounted for.
When Michael Wood-Lewis of Front Porch Forum, a community well known in the state of Vermont, was on the show, he talked about how hard it was to build locally focused, engaged communities in the age of Facebook. Here he talks about how big tech’s failure to appreciate the responsibilities that come with Section 230 could threaten the privileges that the law provides.
[00:19:12] Michael Wood-Lewis: Take a look at the intent of Section 230 and you’ll see that it lays out responsibilities for tech platforms, as well as privileges. But while most big tech players were quick to assert the privilege of freedom from liability for what their users post, they have been much slower to accept the responsibility of moderating user content. That kind of selective approach is not what the law intended and it’s a big part of the dark side of today’s internet.
From the start, Vermont’s Front Porch Forum has understood that a civil welcoming online community requires investment in thoughtful moderation. Our paid online community managers review every posting before publication. While they use a very light touch, their presence makes all the difference.
In a state with 260,000 households, 170,000 people are members of their local Front Porch Forum, and fully half post at least once a year. This remarkable level of engagement among neighbors does wonders for increasing local social capital and community resiliency. That’s what you get when a human-scaled business embraces both the privileges and responsibilities of Section 230.
[00:20:34] Patrick O’Keefe: Michael speaks similarly here to the thoughtfulness that Gail Ann Williams referenced; the balance of privilege and responsibility creates an environment where Front Porch Forum and communities like it are possible. A regulatory environment that enforces some bizarre sense of political neutrality would not.
Finally, we have Angela Connor, who started in community management with a regional news outlet in 2007 before moving to the agency and consulting side.
[00:21:00] Angela Connor: When you care about your brand, you care about the content associated with it, which is why I’ve always felt that community managers are a key component to the success of any online community. There’s value in community guidelines and rules of engagement, and community managers and moderators who seek to uphold and encourage them and think first about the health of the online community.
If it’s to thrive, there has to be some autonomy among the participants, who are indeed the reason for the community or forum or digital space in the first place. Those who provide that space deserve some protections. Section 230 allows those protections.
As an original community manager from the early days, let’s go back to 2007, building and growing a community to upwards of 14,000 members, and someone who has been instrumental in guiding the direction and even the implementation of communities for other brands, I must say that Section 230 of the Communications Decency Act is of high importance. It’s not something that people who actively participate in these conversations online talk about, or even care much about, but we all benefit from it.
Those protections are not limitless. I don’t believe that Section 230 supports the “wild wild west” where there’s no accountability for content, but without it, we would be facing a new future for digital, online, and social communications. I think that it’s of great importance for people to read up on this and understand it, and really get why we need Section 230.
[00:22:37] Patrick O’Keefe: Angela is right when she says that those who provide community spaces deserve protections. That’s us. That’s community pros, hosts and facilitators of all ages. People who started as kids and teenagers like I did. People who manage small communities themselves as a one-person operation, possibly with volunteer moderators, and yes, community professionals at Fortune 500 companies, we all deserve this protection.
We’re not a massive group, but there may be more of us than there are coal miners, and Trump sure loves coal miners. While I think most of us would rather not use our voices and our communities to advocate politically, threats to Section 230 could push us to that point, because they threaten the very existence of our communities.
Thank you to everyone who shared their perspective for this episode. This includes, in order of appearance, Bruce Ableson, director of evangelism and enablement at Adobe. Find Bruce on Twitter @bruceableson. Gail Ann Williams, who consults on community conversation and craft beer at gailwilliams.com. Bassey Etim, editorial director at Canopy, canopy.cr. Follow Bassey on Twitter @BasseyE. Rosemary O’Neill, president of Social Strata, socialstrata.com. Follow Rosemary on Twitter @rhogroupee. Andrew Losowsky, head of Coral at Vox Media, coralproject.net. Find Andrew at losowsky.com. Scott Moore, community veteran with a focus on nonprofits, whom you can follow on Twitter @scottmoore. Michael Wood-Lewis, co-founder and CEO of Front Porch Forum, frontporchforum.com. And Angela Connor, founder and CEO at Change Agent Communications, angela-connor.com and on Twitter @communitygirl.
For the transcript from this episode plus highlights and links that we mentioned, please visit communitysignal.com. Community Signal is produced by Karn Broad and Carol Benovic-Bradley is our editorial lead.
Thank you for listening to Community Signal in 2019. I hope you use this holiday season to rest up and get ready because our communities will need us to fight for them in 2020.
If you have any thoughts on this episode that you’d like to share, please leave me a comment, send me an email or a tweet. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.