Steve Brock has been working in community for over 25 years, with a unique depth of experience in moderation for big brands. He has had to work with law enforcement many times, and on this episode, Mr. Brock shares stories from those efforts. Plus:
- What has remained consistent in his career through four company mergers
- Determining “valid need” with threats of self-harm
- The implication of Facebook’s patent application for a moderation tool
“We’ve gotten better at dealing with [trolling], we’ve gotten better at analyzing it, we’ve gotten better at studying it and finding out why it happens, but the technology that allows that to happen is lagging behind in its ability to help us, the people involved, take better action online.” -@Stevo4747
“[The police] are more than happy to look into [a situation] further if they determine that what’s happening is credible. We’ve been on both sides of that. We’ve saved lives in a schoolyard by sending police to an apartment where somebody was saying, ‘I am looking down on the schoolyard, and I have a gun.’ And it turned out that was true. At the same time, a person was saying, ‘I don’t want to be in this world anymore and I have a gun.’ And they were saved as well.” -@Stevo4747
About Steve Brock
Steve Brock is the director of moderation services at Mzinga, with a track record that includes high traffic and high visibility online communities from award-winning ad campaigns to top-rated prime-time television shows. He services not only clients but their customers, facilitating information exchange, brand, product and service awareness, customer service triage, and transactions.
Mr. Brock has been involved in community management for over 25 years. In 1991, he started the Usenet newsgroup rec.arts.books.reviews, before moving to the Microsoft Network portal in 1994 to manage their Reading forum, where he wrote content, managed 10 bulletin boards, ran a topical chat every evening and brought over 350 bestselling authors to the chat room for live interactive interviews.
In 1999, Mr. Brock joined WellEngaged, a division of The WELL. Through four major mergers and two name changes, he has managed virtually the same team of moderators and a few of Mzinga’s current clients have been with them the entire time.
- Facebook is Patenting a Tool That Could Help Automate Removal of Fake News by Casey Newton, about Facebook’s moderation tool patent
- Mzinga, where Mr. Brock is director of moderation services
- Wikipedia page for Usenet
- WellEngaged, LLC, Acquires Proxicom’s Community Suite
- Delphi Forums and WellEngaged Join to Launch Prospero Technologies
- Mzinga Closes $32M Funding, Buys Delphi Descendant Prospero
- When to Report Someone to Their ISP by Patrick
- Community Signal episode with Alex Embry, a law enforcement officer, where we discussed credible threats made online
- Community Signal episode with Matt Haughey, founder of MetaFilter, where we talked about the well-publicized fake suicide threat on their community
- A Member of Your Online Community Lies About Committing Suicide: What Do You Do? by Patrick
- Community Signal episode with Howard Rheingold, during which Mr. Rheingold wondered how Facebook could have so poorly integrated community tools
- onlinemoderation.com, Mzinga’s site dedicated to moderation-related content
- The University of Arizona’s National Institute for Civil Discourse, which Mr. Brock is involved with
- Mr. Brock on LinkedIn
- Mr. Brock on Twitter
00:04: You’re listening to Community Signal, the podcast for online community professionals. Tweet as you listen using #communitysignal. Here’s your host, Patrick O’Keefe.
00:20 Patrick O’Keefe: Hello, and thank you for listening to Community Signal. Things might be winding down for the holidays, but the show doesn’t stop as we begin year two of the program with guest Steve Brock. We’ll be talking about how moderation companies interact with law enforcement and the impact of a new online community-related patent application from Facebook. Mr. Brock is the director of moderation services at Mzinga, with a track record that includes high-traffic and high-visibility online communities, from award-winning ad campaigns to top-rated prime-time television shows. He services not only clients but their customers, facilitating information exchange, brand, product, and service awareness, customer service triage, and transactions. Mr. Brock has been involved in community management for over 25 years. In 1991, he started the Usenet newsgroup rec.arts.books.reviews, before moving to the Microsoft Network portal in 1994 to manage their Reading forum, where he wrote content, managed 10 bulletin boards, ran a topical chat every evening, and brought over 350 bestselling authors to the chat room for live, interactive interviews. In 1999, Mr. Brock joined WellEngaged, a division of The WELL. Through four major mergers and two name changes, he has managed virtually the same team of moderators, and a few of Mzinga’s current clients have been with them the entire time. Mr. Brock, welcome to the program.
01:32 Steve Brock: Hello, glad to talk to everyone.
01:34 Patrick O’Keefe: Yeah. It’s great to have you on. And I find your career in the space really interesting, because you have remained at the same company through four different mergers. You were in charge of community management for WellEngaged, which according to a 1999 press release was “The leading business-to-business full service solution for building and maintaining online communities.” WellEngaged merged with Delphi Forums, and that company became Prospero Technologies, where you also oversaw community management. Prospero would later merge with KnowledgePlanet, and that’s how Mzinga was born. That’s a 17-year span in your career. What has remained consistent in that time?
02:12 Steve Brock: Unfortunately and fortunately for job security, trolling has remained the same. People wanting to put themselves on top of other people and cause disruption and make the community not the fine place to be that it should be.
02:29 Patrick O’Keefe: I hear that, as I talk to veterans in the space on this show, that a lot of the challenges that we talk about today are challenges that we talked about in the 90s. Would you view that as… Is that a healthy thing as more people enter the space, as more people discover these challenges? How has our progress been, I guess, in dealing with those challenges over the years? Have we gotten better at it?
02:49 Steve Brock: We’ve gotten better at dealing with it, we’ve gotten better at analyzing it, we’ve gotten better at studying it and finding out why it happens, but the technology that allows that to happen is lagging behind in its ability to help us, the people involved, take better action online.
03:10 Patrick O’Keefe: So it sounds like the greatest opportunity for improvement, in your opinion, is on the tech side and there are various companies out there that offer solutions that are helpful. Is there anything that you feel is missing in the market? Something that you would like to see created or something that there’s really a need for?
03:29 Steve Brock: Well, stepping back just a moment, going back to exactly what happens when somebody comes on, raises the roof, needs to be removed, we can ban them, we can ban their content. But if they come back, and if they start making threats, they need to be stopped as fast as possible, and it’s hard to do. Because then what you have to do is you have to get their IP address, hopefully it’s static instead of dynamic. If it’s dynamic, you have to go to their ISP, find out exactly when they were on, give them logs, find out who they are, and then it’s in the ISP’s court to take action as a follow-up. We put it in their hands. And they, for the most part, don’t like being identified as somebody who won’t help us, so they do help us. But again, it’s time-consuming, and meanwhile, this person’s going off in the community and we’re just having to take extra time to take their content down every time it comes up.
04:34 Patrick O’Keefe: I think that’s interesting because a lot of community professionals have never gone to an ISP. It’s something I’ve done in the past but if you don’t work for a moderation company or moderation isn’t your business, if you were just managing a community and moderation is one of your tasks, I find that… A lot of people just have never done that. That’s something you’re reserving for the worst of the worst. Right?
04:56 Steve Brock: Yes.
04:57 Patrick O’Keefe: Because it’s a time-consuming effort to get the ISP’s attention, to send it over and then hope they take action.
05:01 Steve Brock: Yeah. And once you do have the IP address, you can use that and go to… You can find out where they’re located, and actually call the local police department and report it to their cyber crimes unit, if necessary. And I’ve had to do that before.
05:19 Patrick O’Keefe: What type of crimes were you reporting when you called those police departments? And, I guess, what sort of response did you receive?
05:25 Steve Brock: They were personal threats of a credible nature. “I know where you live, I’m coming over,” that kind of thing. The cyber crimes units, again, they differ by jurisdiction. For the most part, they’ve been very helpful and we’ve actually had people go to other locations and take action against that person.
05:48 Patrick O’Keefe: It’s interesting because I had a police officer on the show once, and we talked about what’s actionable, what’s not, and how to make it easier for a local police department to act. When you call them, what sort of information are you really drilling down on to help them feel like they can actually do something about the situation?
06:06 Steve Brock: Well, it’s a knife that cuts both ways because they wanna stop a crime but they also wanna help a person in need. So there are people that are making threats against others but there are also people who are threatening to harm themselves. And so they are more than happy to look into it further if they determine that what’s happening is credible. We’ve been on both sides of that, we’ve saved lives in a schoolyard by sending police to an apartment where somebody was saying, “I am looking down on the schoolyard, and I have a gun.” And it turned out that was true. And at the same time, a person was saying, “I don’t wanna be in this world anymore and I have a gun.” And they were saved as well.
06:51 Patrick O’Keefe: Stories like that are amazing to me. When I had the gentleman on before, he’s one of my moderators on a community I manage, and he happens to be a SWAT team commander outside of Chicago, training sergeant at that department, and he told a story of how, based upon a tip from someone’s Facebook friends, they were able to go and then cut someone down, as they were hanging themselves, and saved their life. But the person’s roommates had no idea. The person’s roommates had no idea what was happening, they would have been surprised to find their roommate, the next morning, or the next day and it was really based upon online tips that they were able to help that person.
07:26 Steve Brock: Yeah. When you close the door to where your computer is, you have your own cocoon where the real world doesn’t notice what’s going on but the virtual world is listening to you.
07:37 Patrick O’Keefe: When you are making that complaint to the police department, you mentioned IP addresses; as you alluded to, those can be sketchy sometimes, tough to determine. With the communities you manage, do you often have extra information about the person, their name or the city where they live, or some other information that allows you to key in and direct, or help direct, those law enforcement resources?
07:57 Steve Brock: Yeah, we do. We have the IP being used at the time of the post, we have a copy of the content, and we have a log of exactly when that went up, down to the second. And that’s all an ISP needs to do the trace.
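To make the three pieces of evidence Mr. Brock lists concrete, here is a minimal, hypothetical sketch (not Mzinga's actual tooling, and all names and values are illustrative) of a record holding exactly what an ISP needs to run a trace: the IP address in use at the time of the post, a verbatim copy of the content, and a timestamp down to the second.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AbuseEvidence:
    """The three items an ISP needs to run a trace, per the discussion above."""
    ip_address: str       # IP address in use at the time of the post
    content: str          # verbatim copy of the offending post
    posted_at: datetime   # timestamp of the post, to the second (UTC)

    def report_lines(self) -> list[str]:
        """Render the evidence as lines for an abuse report to the ISP."""
        return [
            f"IP address: {self.ip_address}",
            f"Timestamp (UTC): {self.posted_at.strftime('%Y-%m-%d %H:%M:%S')}",
            "Content:",
            self.content,
        ]

# Illustrative example; 203.0.113.0/24 is a documentation-only IP range.
evidence = AbuseEvidence(
    ip_address="203.0.113.42",
    content="<offending post text here>",
    posted_at=datetime(2016, 12, 19, 14, 30, 5, tzinfo=timezone.utc),
)
print("\n".join(evidence.report_lines()))
```

Capturing the timestamp in UTC matters here: the ISP correlates it against its own DHCP lease logs, so an ambiguous local time can point at the wrong subscriber.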
08:14 Patrick O’Keefe: This is interesting to me because I don’t think I have ever talked to anybody about an ISP trace before. Is it the police department asking for the trace to locate that person?
08:23 Steve Brock: The police department still has to go to the ISP as well.
08:26 Patrick O’Keefe: Interesting. Sticking with that theme, we’re talking about law enforcement, and you mentioned before the show that you’ve worked with the FBI, who then apprehended users in the spaces you were managing who, as you mentioned, wanted to harm others. It’s an area where a lot of people don’t have experience, and I’d like to talk about the process of dealing with that, and how seriously it’s taken. How to be taken seriously as a community manager. Because there are things online. You can submit a tip right to the cyber crimes division, but I have to believe they get so many submissions, so many things submitted to them. If you’re a community professional and you feel like you have a serious, credible threat, how do you make it so that your claim is taken seriously?
09:07 Steve Brock: It’s always in conjunction with a client that I’m working with. And so when we have a contract with that client, we make sure that we have a contact at that client that has expertise in escalating threats, and we make sure that there is someone available 24/7, as we are.
09:26 Patrick O’Keefe: That’s interesting. So you’re working with large clients, credible brands, and they have, I’m just guessing, their general counsel or policy advisors who have those connections. And so when you escalate that threat to them, they take it over and have that connection already to help make sure that it receives the proper attention.
09:43 Steve Brock: That’s correct and, unfortunately, some of them we work with quite often.
09:47 Patrick O’Keefe: Have you noticed any trends as far as the types of communities that are more likely to attract that sort of attention? Not specific ones or specific brands but just, in audiences, are there more higher-risk audiences that you’ve dealt with where, say, threats of violence are more common or threats of self-harm are more common?
10:04 Steve Brock: I would say news, sports, finance and entertainment. Those are the big four.
10:11 Patrick O’Keefe: I find finance to be the interesting one there.
10:14 Steve Brock: Well, finance hits people’s lives, and if there’s something that happens with a company and they don’t have a good relationship with them, they will go to that company’s site and make threats.
10:30 Patrick O’Keefe: It’s interesting. Yeah, news comments, obviously, get a very bad reputation, some of it deservedly so. Entertainment seems logical, but finance is interesting. But, like you said, finance is such a varied space. I guess that could be a financial company like you talked about, it could be like… Let’s say someone lost their retirement.
10:49 Steve Brock: That is correct.
10:50 Patrick O’Keefe: That could drive someone to do something drastic or to feel like life’s not worth living any longer. So it’s interesting to think about those more business-minded topics and the fact that people within those communities are just as vulnerable. Someone might mention a community of teenagers as somewhere you might receive a lot of threats. But financial sites, financial companies are dealing with, in many cases, or most cases, adults. Those audiences can be, I guess in some cases, from what you’re saying, just as vulnerable as the audiences we might typically think about.
11:22 Steve Brock: That’s correct. And at the same time, some of the work that we do with our clients, as moderators, we are in the reputation management and reputation resurrection space. So we’re trying to help people feel better about that brand.
11:38 Patrick O’Keefe: And going back to saving lives, another thing you told me before the show is that you are most proud of the actions that your companies, your teams have taken to save lives. You were saying we have been able to get help to those who express a “valid need” for it, as well as stopping those intent on harming others. “Valid need” is a really interesting term because we deal with words on a screen, so many words, so many posts, so many comments. I had the founder of MetaFilter on a couple weeks ago, and they had a really highly-publicized fake suicide threat on their community, where someone gave them a great story and told them this thing, and they took it seriously, and it did have some credibility, and then the person was just playing a game. When it comes to recognizing valid need, do you have any tips for that?
12:20 Steve Brock: The biggest tip that I can give is to treat every need as a valid one. You are not the only judge of the need. You’re, in many cases, the first judge, but a lot of times, if you have a mechanism where users are able to report what they see to a space, a queue that is monitored, you can find out something is happening. You can go over and take charge of it. But the biggest tip that I can give is to be as flexible as you can, and don’t rule something out that could be credible. Treat just about everything as if it is credible, and go ahead and make the escalation. It goes through other filters of people who have their own mandates and decision-making capabilities. So once you report it, they get the information from you, they take it, and take action. And if they end up at somebody’s house and find out that it was not credible, there are consequences for that, too. You could be charged with a crime.
13:35 Patrick O’Keefe: It’s interesting because there’s different worlds of community. In your case, you have brands that you escalate it to, and then they can take it from there. A lot of community professionals at different companies have to handle it internally or for themselves, so it is such a hard topic. It’s such a challenging topic, but I think, what I try to tell people, and it’s similar to what you’re saying here, is to take it seriously, to do the best you can, and handle it as best you can with the information you have at hand. Really, try your best not to take it too personally. It’s hard. And obviously, you know this from managing moderators for as long as you have, but it’s hard sometimes for moderators to divest themselves of certain high-leverage, highly-emotional situations. Being a moderator’s a hard job. You manage around 50 multilingual moderators from around the world, and as I mentioned earlier, you’ve been able to keep a lot of the same people for a very long time. Who’s been with you the longest?
14:29 Steve Brock: I actually have a couple of volunteers that worked with me at Microsoft Network, back when the moderators there were volunteers. We didn’t pay any of the moderators that we had on at that time. And when I started at WellEngaged, I was able to bring them in. Again, this was a book-related Microsoft forum, and there just so happened to be a book-related client. And so, I contacted the two moderators and was very happy to start paying them to moderate for me. And they’ve been with me ever since.
15:05 Patrick O’Keefe: So that’s around 20 years?
15:07 Steve Brock: Yeah.
15:07 Patrick O’Keefe: Why are they still in the role? Why do people like moderating? Why do people stick around, certain people stick around in that role for so long, do you think?
15:14 Steve Brock: They like dealing with people, they like being online, and they like thinking that they are helping people become better interactive participants. That’s one thing that I stress to every one of my moderators is to make sure that they feel that they are making their spaces a better place and a more welcome place to interact.
15:40 Patrick O’Keefe: I like to say that, as I manage communities, as I moderate communities, it’s my job to look at the bad stuff. I look at the bad stuff so that the members of the community don’t have to. So moderators have to see a lot of the bad stuff. Do you do anything or what do you do to make sure that they’re taking care of themselves, that self-care is a priority, that they’re not getting burned out from having to deal with what is, in many cases, the worst part of many online communities.
16:04 Steve Brock: All of my moderators are contractors, and so I tell them that we have the ability to swap out hours. If somebody needs to take extra time, they can get somebody to work the shift for them on that account. It’s up to them to let me know how they’re doing, and I can pretty much tell. I do recommend that they take time to take care of themselves. Another thing is that people who interact a lot need to take care of themselves as well. There’s people that get banned on certain communities over and over, and I actually reach out to them on a one-on-one basis and talk to them about how to become a better… I call them Internet citizens… How to interact better so they’re not getting into arguments that escalate and get out of hand. And just how to be more responsive to the needs of the community.
17:03 Patrick O’Keefe: Those conversations can be somewhat time-consuming, I think. I don’t know what the success rate is. I don’t think it’s high, but when they work out, it’s pretty rewarding, isn’t it?
17:12 Steve Brock: Very much. One of the things that I tell other people is that some of the worst trolls have the potential to become some of the best online citizens, and they have been.
17:24 Patrick O’Keefe: Casey Newton at The Verge reported that Facebook has applied for a patent for “systems and methods to identify objectionable content.” Newton focuses mostly on it through the lens of the fake news issue, but it’s clear, from reading the article and taking a look at the patent, that it’s really a moderation tool, and one that takes reports from users and passes them through machine learning to score content and determine what should then be removed. To do this, it relies on various signals, such as weighting reports based upon the reputation of the reporting user. When I read the story, my first thought was it sounds like Facebook is trying to patent strategies that community professionals have already been using. I’m curious to hear what you thought.
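To illustrate the idea being discussed, here is a minimal, hypothetical sketch (this is not Facebook's actual system, and the reputation values and threshold are invented for illustration) of reputation-weighted report scoring: each user report counts for more or less depending on the reporter's track record, and content whose weighted score crosses a threshold is flagged for removal.

```python
# A neutral weight assumed for reporters with no track record yet.
DEFAULT_REPUTATION = 0.5

def weighted_report_score(reports, reputations):
    """Sum the reports, each weighted by the reporter's reputation (0.0-1.0)."""
    return sum(reputations.get(user, DEFAULT_REPUTATION) for user in reports)

def should_remove(reports, reputations, threshold=2.0):
    """Flag content for removal once the weighted score crosses the threshold."""
    return weighted_report_score(reports, reputations) >= threshold

# Illustrative reputations; a real system would learn these from each
# reporter's history of accurate versus frivolous reports.
reputations = {"alice": 0.9, "bob": 0.8, "mallory": 0.1}

print(should_remove(["alice", "bob", "carol"], reputations))  # 0.9 + 0.8 + 0.5 = 2.2
print(should_remove(["mallory", "carol"], reputations))       # 0.1 + 0.5 = 0.6
```

The design point the patent (and long-standing community practice) captures is that two reports from trusted members can outweigh many reports from members who habitually abuse the report button.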
18:08 Steve Brock: Well, there’s a couple of thoughts. Number one, they want their name on something that helps, again, their reputation for community management and management of content that doesn’t offend others, which any other large software company wants as well. But if you drill down even further, it says in the article that we were both talking about, Facebook has a pattern of applying for these types of patents and getting the news out that they’re doing it whether they’re really intending on doing it or not. They just want the word to get out that they’re doing it. They may or may not act on it.
18:50 Patrick O’Keefe: I’m seeing a lot of tech-savvy people on Twitter, Facebook, etcetera talk about this and say, “This isn’t something that should be patented. This is something that should be just out in the public domain.” I see harm in this patent if it goes through and if they use it, and if they hold other people to… Which is always the danger, because it doesn’t sound like anything inherently new. I’ve talked to different people who said, “Yeah, x was writing something like that back in the 90s.” Are you concerned by that sort of technology being patented?
19:17 Steve Brock: No, because as far as my concept of patent process goes, it’s just Facebook putting their name on its own version of what it can do. It’s not putting it on the entire aspect of automated moderation as far as I’m aware at this point. And so, I’m fine with them doing it and putting their name on their version. That’s not to say that every other version that comes out that’s a little bit different is a patent violation.
19:51 Patrick O’Keefe: And it sounds like, from reading the Mzinga website and from talking to you, that you do some work on Facebook. And I’ve talked to different people, and I use Facebook myself, obviously, and one area where they could certainly use some help, in my opinion, is the moderation tools.
20:06 Steve Brock: I agree. That’s for sure. Yeah, we do a lot of work on Facebook and a lot more work is headed to there as well.
20:14 Patrick O’Keefe: I had Howard Rheingold on the show a while back, and he said that he spent some time talking to some people at Facebook about this, and he couldn’t believe that, as smart as they are, they have added forums and, I guess, the management of forums too in such a haphazard or non-modern way. And it’s funny. I know I talk to a lot of people who manage brand presence on Facebook and manage communities on Facebook. I haven’t talked to a single person who said, “You know what? Those Facebook tools, they’re amazing. I love those tools. They have such great tools.” I haven’t talked to one person.
20:48 Steve Brock: I have heard the same.
20:50 Patrick O’Keefe: So Facebook has the great benefit of being the social platform of our time, has the people, but when it comes to tool sets for, what is an, admittedly, small group of people, I guess, in their world, community professionals, people managing pages, but an important group nonetheless. They just haven’t made those tools as good. They haven’t made those tools… Forget industry-leading, they haven’t made those tools even, I would say, average for the community space when it comes to moderation and management.
21:16 Steve Brock: Well, that will be a great tagline for them. It doesn’t work well but, boy, we got the patent for it.
21:24 Patrick O’Keefe: Well, Mr. Brock, it has been a pleasure to have you on the show. Thank you for joining me.
21:28 Steve Brock: It’s been a pleasure as well.
21:30 Patrick O’Keefe: We have been talking with Steve Brock, the director of moderation services at Mzinga. Visit them at mzinga.com and onlinemoderation.com. He is also involved with the University of Arizona’s National Institute for Civil Discourse. Their website is nicd.arizona.edu. Finally, you can find Mr. Brock online on LinkedIn at linkedin.com/in/brocksteve and Twitter @stevo4747, that’s S-T-E-V-O-4-7-4-7. For the transcript of this episode, plus highlights and links that we mentioned, please visit communitysignal.com. Community Signal is produced by Karn Broad. We’ll be back next week.
Thank you for listening to Community Signal.