Building a Database of CSAM for AOL, One Image at a Time

If you work in content moderation, or with a team that specializes in it, then you know that the fight against child sexual abuse material (CSAM) is a challenging one. The New York Times reported that in 2018, technology companies flagged a record 45 million online photos and videos of child sexual abuse. Ralph Spencer, our guest for this episode, has spent more than 20 years working to make online spaces safer and to combat CSAM, including as a technical investigator at AOL.

Ralph describes how, when he first started at AOL in the mid-’90s, the work of finding and reviewing CSAM was largely manual: his team depended on community reports, and every flagged item was reviewed by a person. That experience eventually led to the creation of AOL’s Image Detection and Filtering Process (IDFP), which reduced the need for staff to view the actual content of CSAM. Ralph shares how his team’s work evolved through partnerships with the National Center for Missing and Exploited Children (NCMEC), law enforcement, and a coalition of other companies, what he considered his own metrics of success for this work, and the challenges that he sees for today’s platforms.
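
The episode keeps things high level, but the core technique behind a filter like IDFP is matching file hashes of email attachments against a database of hashes of previously identified images. Below is a minimal sketch of that idea in Python; the hash values and function names are hypothetical, and real systems rely on vetted hash lists and, more recently, perceptual hashes such as PhotoDNA rather than plain cryptographic digests.

    import hashlib

    # Hypothetical stand-in for a vetted database of hashes of known,
    # previously identified images; the values here are placeholders.
    KNOWN_IMAGE_HASHES = {
        "9e107d9d372bb6826bd81d3542a419d6",
    }

    def is_known_image(attachment: bytes) -> bool:
        """Return True if an attachment is a byte-for-byte copy of a known image.

        A cryptographic digest such as MD5 only catches exact copies; any
        re-encoding or resizing defeats it, which is one reason the industry
        later moved to perceptual hashing.
        """
        return hashlib.md5(attachment).hexdigest() in KNOWN_IMAGE_HASHES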

The tools, vocabulary, and affordances available to professionals working to make the internet safer have all improved greatly, but in this episode, Patrick and Ralph discuss the areas that still need work. They discuss Section 230 and what considerations should be made if it were to be amended. Ralph explains that during his time at AOL, the service surpassed six million users; as of last year, Facebook had 2.8 billion monthly active users. With a user base that large, and that much control over how people communicate, what does the future hold for keeping children, workers, and everyone else who uses these platforms safe?

Ralph and Patrick also discuss:

  • Ralph’s history fighting CSAM at AOL, both manually and with detection tools
  • Apple’s announcement that it will scan iCloud photos for matches against the NCMEC database
  • How Ralph and other professionals dealing with CSAM protect their own health and well-being
  • Why Facebook is calling for new or revised internet laws to govern its own platform

Our Podcast is Made Possible By…

If you enjoy our show, please know that it’s only possible with the generous support of our sponsor: Vanilla, a one-stop shop for online community.

Big Quotes

How Ralph fell into trust and safety work (20:23): “[Living in the same apartment building as a little girl who was abused] was a motivational factor [in doing trust and safety work]. I felt it was a situation where, while I did basically all I could in that situation, I [also] didn’t do enough. When this [job] came along … I saw it as an opportunity. If I couldn’t make the situation that I was dealing with in real life correct, then maybe I can do something to make a situation for one of these kids in these [CSAM] pictures a little bit better.” –Ralph Spencer

Coping with having to routinely view CSAM (21:07): “I developed a way of dealing with [having to view CSAM]. I’d leave work and try not to think about it. When we were still doing this as a team … everybody at AOL generally got 45 minutes to an hour for lunch. We’d take two-hour lunches, go out, walk around. We did team days before people really started doing them. We went downtown in DC one day and went to the art gallery. The logic for that was like, you see ugly stuff every day, let’s go look at some stuff that has cultural value or has some beauty to it, and we’ll stop and have lunch at a nice restaurant.” –Ralph Spencer

How organizations work with NCMEC and law enforcement to report CSAM (28:32): “[When our filtering tech] catches something that it sees in the [CSAM] database, it packages a report which includes the image, the email that the image was attached to, and a very small amount of identifying information. The report is then automatically sent to [the National Center for Missing and Exploited Children]. NCMEC looks at it, decides if it’s something that they can run with, and if it is … they send the report to law enforcement in [the correct] jurisdiction.” –Ralph Spencer
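
To make the shape of that pipeline concrete, here is a hedged sketch of the report-packaging step Ralph describes: the matched image, the email it arrived in, and a very small amount of identifying information, bundled for automatic submission to NCMEC. All names and fields here are hypothetical; the actual report format is defined by NCMEC’s CyberTipline and is not shown here.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class SuspectedCsamReport:
        """Hypothetical bundle mirroring the report Ralph describes."""
        matched_image: bytes     # the image that hit the hash database
        source_email: bytes      # the raw email the image was attached to
        account_identifier: str  # a very small amount of identifying info
        matched_hash: str        # which database entry was matched
        detected_at: datetime = field(
            default_factory=lambda: datetime.now(timezone.utc)
        )

    def package_report(image: bytes, email: bytes,
                       account: str, digest: str) -> SuspectedCsamReport:
        # In production, this bundle would be queued for automatic
        # transmission to NCMEC, which decides whether to refer it to
        # law enforcement in the appropriate jurisdiction.
        return SuspectedCsamReport(image, email, account, digest)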

When “Ralph caught a fed” (37:37): “We caught the guy who was running the Miami office of [Immigration and Customs Enforcement]. He was sending [CSAM]. … That one set me back a little bit. … I remember asking the guy who started the team that I was on, who went on to become an expert witness. He worked in the legal department, and his job basically was to go around the country and testify at all the trials explaining how the technology that caught these images worked. I said, ‘I got an email about this guy from ICE down in Florida, was that us?’ He’s like, ‘Yes, that was you.'” –Ralph Spencer

Facebook’s multiple lines of communication offer multiple avenues for content violations (45:08): “Zuckerberg is running around talking about how he’s trying to get the world closer together by communicating and increasing the lines of communication. A lot of these lines just lead to destructive ends.” –Ralph Spencer

About Ralph Spencer

Ralph Spencer has been working to make online spaces safer for more than 20 years, starting with his time as a club editorial specialist (message board editor) at Prodigy and then graduating to America Online. He’s wrestled with some of the most challenging material on the internet.

During his time at AOL, Ralph was a terms of service representative, a graphic analyst, and a case investigator before landing his final position as a technical investigator. In that position, he was in charge of dealing with all issues involving child sexual abuse material (CSAM), then referred to as “illegal images” by the company. Ralph oversaw the daily operation of the automated processes used to scan AOL member email for these images and the reporting of these incidents to the National Center for Missing and Exploited Children (NCMEC), which ultimately sent these reports to the appropriate law enforcement agencies.

The evidence compiled by Ralph and the team he worked with in AOL’s legal department contributed to numerous arrests and convictions for the possession and distribution of CSAM. He currently lives in the Washington, DC area and works as a freelance trust and safety consultant.


Your Thoughts

If you have any thoughts on this episode that you’d like to share, please leave me a comment, send me an email or a tweet. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.
