Building a Database of CSAM for AOL, One Image at a Time

If you work in content moderation or with a team that specializes in content moderation, then you know that the fight against child sexual abuse material (CSAM) is a challenging one. The New York Times reported that in 2018, technology companies flagged a record 45 million online photos and videos of child sexual abuse. Ralph Spencer, our guest for this episode, has been working to make online spaces safer and to combat CSAM for more than 20 years, including as a technical investigator at AOL.

Ralph describes how, when he first started at AOL in the mid-’90s, the work of finding and reviewing CSAM was largely manual: his team depended on community reports, and every piece of reported content was reviewed by hand. Eventually, this manual review led to the creation of AOL’s Image Detection Filtering Process (IDFP), which reduced the need to manually review the actual content of CSAM. Working with the National Center for Missing and Exploited Children (NCMEC), law enforcement, and a coalition of other companies, Ralph shares how he saw his own team’s work evolve, what he considered his metrics of success for this work, and the challenges that he sees for today’s platforms.
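The episode doesn’t spell out how IDFP worked under the hood, but the general technique it points to – matching files against a database of previously identified images so that known material doesn’t need to be re-reviewed by a person – can be illustrated with a minimal sketch. This is an illustration under assumptions, not AOL’s actual implementation: the hash value and database here are placeholders, and production systems typically use perceptual hashes (e.g., PhotoDNA) so that re-encoded or resized copies still match, whereas an exact cryptographic hash only catches byte-identical files.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hashes for previously confirmed images. In practice this
# would be a vetted, access-controlled database (e.g., hash lists shared through
# NCMEC), not a hard-coded set, and the value below is just a placeholder.
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_image(path: Path) -> bool:
    """Flag a file for escalation if its hash matches a known image."""
    return sha256_of_file(path) in KNOWN_IMAGE_HASHES
```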

The tools, vocabulary, and affordances for professionals working to make the internet safer have all improved greatly, but in this episode, Patrick and Ralph discuss the areas that need continued improvement. They discuss Section 230 and what considerations should be made if it were to be amended. Ralph explains that when he worked at AOL, the service surpassed six million users. As of last year, Facebook had 2.8 billion monthly active users. With a user base that large and a monopoly on how so many people communicate, what does the future hold for keeping children, workers, and everyone else who uses these platforms safe?

Ralph and Patrick also discuss:

  • Ralph’s history fighting CSAM at AOL, both manually and with detection tools
  • Apple’s announcement to scan iCloud photos for NCMEC database matches
  • How Ralph and other professionals dealing with CSAM protect their own health and well-being
  • Why Facebook is calling for new or revised internet laws to govern its own platform
Continue reading “Building a Database of CSAM for AOL, One Image at a Time”

Shifting Revel, a Community for Women Over 40, from In-Person to Online Overnight

As community practitioners, we often serve communities that we don’t necessarily belong to. But how would you approach designing a community platform, events, and policies for a demographic that you don’t belong to? Alexa Wahr, the COO of Revel, a community for women over 40, says that she and her co-founder build by putting their community first. “We absolutely listen to our members. We don’t try to pretend like we know what exactly our members are going through or what it’s like to be a woman in their life. That doesn’t mean that we can’t help to build the community and build the tools that help them connect.”

In this episode of Community Signal, Alexa shares how the policies that govern the platform, Revel’s approach to safety during the pandemic, and Revel’s acquisition of The Woolfer are all grounded in putting their members’ needs, safety, and experiences first.

Alexa also discusses how Revel, an in-person events-based community, shifted entirely to virtual events in light of the pandemic. Through this model, Revel members have continued to have meaningful interactions, build friendships, and support one another through COVID-19.

Alexa and Patrick also discuss:

  • How Revel is encouraging their event hosts to stay safe now that in-person events have resumed
  • Revel’s plans to introduce paid events into their community
  • The differences between the Revel and Woolfer communities and how they’re balancing the needs of both
Continue reading “Shifting Revel, a Community for Women Over 40, from In-Person to Online Overnight”

How Telehealth Provides More Efficient Healthcare for Patients and Providers – and the Role Online Communities Can Play

How did the pandemic impact your relationships with your healthcare providers? Did telehealth enable you to continue seeing or connecting with your providers to receive the care that you needed?

In this episode of Community Signal, Denzil Coleman, a telehealth coordinator who develops and maintains digital health interventions at the Medical University of South Carolina (MUSC) Center for Telehealth, discusses how the adoption of telehealth interactions and practices during the pandemic may lead to continued, longer-term improvements and efficiencies in our healthcare system.

Denzil explains that telehealth is “anything where healthcare is being impacted by a patient and an actor that are not in the same location. That includes a video, that includes transmissions of information, asynchronous messaging, [and] remote patient monitoring.” Telehealth can create efficiencies for both patients and providers –– giving patients flexibility to see their providers without the burden of travel and with the option to invite more caregivers into these interactions.

Whereas in the past, patients may have received pamphlets with details about in-person support groups or other care options, today there are online communities and support groups, and insurance companies themselves even offer telehealth options. With these options come more opportunities for patients to be engaged in the care that they receive and for providers to thoughtfully care for them.

Denzil and Patrick also discuss how:

  • COVID, the shifting landscape of the healthcare profession, and the fact that folks are living longer, healthier lives all impact the healthcare system
  • The flexibility of telehealth allows a patient’s support system to become more involved in their care
  • Creating efficiencies in the healthcare system should not equate to patients receiving less care
  • Value-based care could resemble a community-like investment in overall care
Continue reading “How Telehealth Provides More Efficient Healthcare for Patients and Providers – and the Role Online Communities Can Play”

While Making a Mixtape, Asher Roth Built an Online Community

Photo: Drew Dennis

In between his three albums, rapper Asher Roth has released several mixtapes, including 2011’s Pabst & Jazz and his The Greenhouse Effect series. The third entry in that series, The Greenhouse Effect Vol. 3, hit streaming services on September 3, 2021.

But there’s something about his latest mixtape that sets it apart from every album, EP, and mixtape he’s released so far: It was a collaboration with his online community of fans and supporters.

As Asher contemplated making music during the COVID-19 pandemic, he came up with an idea: What if The Greenhouse Effect Vol. 3 was “entirely produced by fan/friend/follower submissions?” He set up a Discord, and off they went. He’d post acapellas – audio clips of only his vocals – and community members would produce song submissions, which Asher would review live on Twitch. The project would adopt a narrative story, adding guest verses from the community, too.

With the mixtape out, Asher stops by to talk about the collaborative process behind the release, the tools he used, and the community building lessons he learned along the way. One of the great things about this story is that the creation of this mixtape has helped birth an active online community, which Asher hopes will foster further collaborations between members.

Asher and Patrick also discuss:

  • How guardrails help encourage sustained creativity
  • Why he chose Discord as the community’s home
  • What’s next for the community now that it has achieved its first big goal
Continue reading “While Making a Mixtape, Asher Roth Built an Online Community”

Here’s How Anti-Vaxxers Are Spreading Misinformation Despite Your Best Moderation Efforts

What moderation tactics have you used or seen as a mechanism to curtail the spread of misinformation in communities and on social media platforms? Word detection, link blocking, and digital stickers promoting legitimate information sources may immediately come to mind.

But what would happen if you ran your moderation tools against the URLs shared via link-in-bio services used in your community? Or what if you learned that folks on your platform were using specific codewords to circumvent word detection? Or posting screenshots of misinformation rather than plain text? People are getting creative with how they share all types of information online, misinformation included. Are our moderation strategies keeping up?
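To make the circumvention problem concrete, here is a minimal sketch of the kind of plain-text keyword filter being worked around. The blocked terms are hypothetical placeholders; the codewords (“pizza,” “Moana”) and the screenshot tactic are the ones described later in the episode.

```python
# Illustrative blocklist; real moderation systems maintain far larger,
# regularly updated term lists and more sophisticated matching.
BLOCKED_TERMS = {"vaccine injury", "vaxxed"}

def flags_post(text: str) -> bool:
    """Return True if any blocked term appears in the post text."""
    lowered = text.lower()
    return any(term in lowered for term in BLOCKED_TERMS)

print(flags_post("My cousin got vaxxed and..."))      # True: exact term match
print(flags_post("My cousin got the Moana and..."))   # False: codeword slips through
print(flags_post("[screenshot of the same claim]"))   # False: text inside an image is invisible here
```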

In this discussion, Patrick chats with Joseph Schafer, an undergraduate student of Computer Science and Ethics at the University of Washington, and Rachel Moran, a postdoctoral fellow at the University of Washington’s Center for an Informed Public. They discuss their research and how anti-vaccine advocates are circumventing content moderation efforts on Facebook, Instagram, Twitter, and other large social networks. Some of their findings might surprise you! For example, specific folk theories have emerged that define how some believe social platforms and algorithms work to moderate their content and conversations. And whether these theories are true or not, the strategies forming around them do seem to help people keep questionable content up long enough for researchers to come across it.

So, where do we start? How can we detect misinformation if people are using codewords like “pizza” or “Moana” to get around our tools and teams? There may not be precise solutions here just yet, but Rachel and Joseph both offer ideas to help us down the right path, which starts with deciding that the engagement that brews around misinformation isn’t worth the long-term health of your community.

Among our topics:

  • Why Linktree needs community guidelines and how link-in-bio sites have become a vector for misinformation
  • The folk theories that are informing how we perceive and operate around social media algorithms
  • Adapting your moderation strategies to better find misinformation
Continue reading “Here’s How Anti-Vaxxers Are Spreading Misinformation Despite Your Best Moderation Efforts”

Fostering Resiliency for Community, Moderation, Trust, and Safety Pros

When was the last time you mandated that your community, moderation, trust, and safety colleagues schedule time for out-of-queue activities? When was the last time you led by example and took a break or participated in other wellness activities before burnout set in? What was the last tool your product team built to help foster resiliency for your moderators?

While we can’t mitigate all burnout, in this episode, Patrick and our guest, Adelin Cai, discuss how employee resiliency programs and policies can help you create an all-around safer environment for your colleagues and teams. Product solutions like well-defined queues and changes to how harmful content is presented can also foster resiliency from a workflow perspective.

With experience in policy, trust, and safety leadership for Pinterest, Twitter, and Google, Adelin also shares her approach to thinking about the metrics that matter. Spoiler: Metrics that revolve around quantity, like the number of cases closed, or even quality, like CSAT, may not always equate to success or reflect the health of your community. Adelin also discusses working collaboratively with product and engineering teams to ensure that there’s transparency about what is being built and launched, and about which community behaviors or metrics should be monitored to gauge performance and influence the further direction of the product.

Among our other topics:

  • The baseline for an employee resilience program
  • What an ideal work relationship with product and engineering looks like
  • How to reallocate resources and budget to prioritize essential moderation, trust, and safety work
Continue reading “Fostering Resiliency for Community, Moderation, Trust, and Safety Pros”

What Makes an Online Community a Home?

May 21st, 2021 marked 20 years since the launch of KarateForums.com. In this episode of Community Signal, Patrick speaks with five forum members who have been on KarateForums.com for nearly 65 years, collectively. Together, they discuss what keeps them coming back to the community as members, moderators, and martial artists.

While each member brings different experiences and backgrounds to the community, Bob, Brian, Danielle, Devin, and Noah all cite the quality of the interactions that they’ve had in the community and how it has brought out their skills as community members, teachers, and students of the martial arts. Those interactions helped these folks launch their own martial arts schools, grow as martial artists, and pay it forward to hundreds of thousands of other folks seeking out knowledge.

Whether you’re listening to this episode with 20 years of community management experience or you’re still working toward that milestone, a few things emerge as truths from this episode –– that it’s not the size of a community that matters, but the level of care that you find there. That community members can go from the verge of being banned to becoming model community members, if given the chance. That communities thrive when they help their members achieve their goals and pay it forward to others. Whether this is your first year as a community manager or your twentieth, we hope that you find these lessons and stories helpful. And here’s to another 20 years of KarateForums.com!

They also discuss:

  • The benefits of your members joining other communities
  • How KarateForums.com helped each guest find confidence, friends, and more
  • Why Devin describes KarateForums.com as charitable
Continue reading “What Makes an Online Community a Home?”

Dismantling the Model Minority Myth and Fostering Safer Communities, One Conversation at a Time

For this episode of Community Signal, we’re joined by community professionals Jenn Hudnet, Lana Lee, and Phoebe Venkat. They candidly share stories about the impact of racism and stereotypes against Asians, Asian Americans, and Pacific Islanders in their own lives, in the workplace, and in the communities they manage.

Jenn, Lana, and Phoebe each had stories to share about their families, the circumstances that brought them to the United States, the racism and discrimination they faced, and the shared generational trauma they’re working through together. “We have to look forward. We’ve got to acknowledge some of the wrongs that happened to our parents, relatives, and friends in the past. It’s very difficult to do. We’re doing it, but it definitely takes a community of community to get that done,” shared Phoebe (7:47).

There’s also a discussion around the work that companies and colleagues must do to maintain safe workplaces and communities. “Your intention might not always be to hurt or harm someone or to make fun of someone, but the impact is still there. Being able to understand the impact that our words and actions have on others is important [as well as] being able to acknowledge the impact that it might have on somebody. I think microaggressions are something that I’ve even had to learn to recognize because I’ve just internalized them and accepted them over the years of being here,” said Jenn (21:12).

And there’s an important reminder in this episode to see your colleagues and community members as individuals. Individuals that might have a bad day, that might make mistakes, or that might be comforted just by your presence. “Sometimes we hear stories of people. [Maybe] they posted a really good picture one day and then the next day they’re feeling down. … As a community manager, [it’s really important to] take time to read and understand where people are coming from,” explains Lana (49:46).

We’re thankful to Jenn, Lana, and Phoebe for sharing with us. May this conversation lead to safer communities, neighborhoods, workplaces, and personal boundaries.

Lana, Jenn, Patrick, and Phoebe also discuss:

  • The model minority myth and the harm it causes
  • Recognizing emotional labor and setting boundaries
  • Why there are no growth hacks when it comes to helping your community members feel safe
Continue reading “Dismantling the Model Minority Myth and Fostering Safer Communities, One Conversation at a Time”

Helping Online Community Members Experiencing a Mental Health Crisis

Crisis Text Line offers free, 24/7 support via text message to anyone facing a mental health crisis. Some organizations partner with Crisis Text Line to develop co-branded text lines for their community, but you can start today by making Crisis Text Line part of your policy and response strategy for when anyone in your community or on your team shares or shows signs that they’re experiencing a mental health crisis.

The other part of your response strategy leverages a skill that you likely practice every day –– empathy. Becka Ross, the chief program officer at Crisis Text Line, reminds us that “anybody can be empathetic. When somebody is expressing or showing signs of mental illness, it’s not the expectation that somebody steps up into a role of a psychotherapist or a doctor or any other mental health professional, but all humans can be empathetic to one another.”

Crisis Text Line is powered by a team of 39,000 volunteers. Their community, training, and volunteer opportunities call on people from all walks of life to work together to help those facing mental health crises. In our discussion with Becka, you’ll learn not only how the team supports one another through community, but also how you can do the same for your own community members and the people you care about.

Becka and Patrick discuss:

  • How Crisis Text Line partners with organizations and offers itself as a resource to anyone in need
  • Forming a mental health crisis policy for your community
  • Using machine learning to respond most quickly to those most at risk
Continue reading “Helping Online Community Members Experiencing a Mental Health Crisis”

Whistleblower: Facebook is Allowing Dictators to Mislead Their Citizens

Last month, Sophie Zhang, a former data scientist at Facebook, went public as a whistleblower, drawing attention to how the company delayed action against or outright ignored manipulation of its platform by autocratic leaders and governments around the world, to the detriment of the people of those countries.

All work, including community management, requires trade-offs, areas of focus, and prioritization. Our teams and resources allow us to increase our areas of focus and more consistently foster the interactions that our communities exist for. But for an organization with the staff and resources of Facebook, you’d expect the trade-offs to be few and far between, and the areas of focus to be vast – covering the parts of the platform prone to abuse just as much as those that foster healthy interactions.

But at Facebook, Sophie describes how, at least internally, drawing the line between healthy interactions and “inauthentic interactions” surfaced potential conflicts of interest, slowness to take action, and a tendency to focus on some countries more than others.

When we’re prioritizing what to work on or how to foster our communities, we may reference company values or internal OKRs. But for community professionals, there’s also the question of how this work preserves the safety of the community and those in it. How is Facebook scaling to protect the political safety of its members? Or perhaps a better question is, does it even think it has the responsibility to do so? As Sophie says, “it’s important to remember that, at the end of the day, Facebook is a company. Its goal is to make money. It’s not focused on saving the world or fixing the product. I think it’s important to be cynically realistic about the matter.”

Sophie and Patrick discuss:

  • Manipulation so brazen that the government actors didn’t even bother to hide it
  • The real-world implications that “inauthentic behavior” on Facebook has had for Azerbaijan, Honduras, India, and other countries
  • How Facebook differentiates and actions inauthentic profiles and pages
Continue reading “Whistleblower: Facebook is Allowing Dictators to Mislead Their Citizens”