Dr. Talia Stroud, associate professor of communication studies at The University of Texas at Austin, joins me on this episode to discuss the obstacles that can prevent comment sections from being great, and offer straightforward recommendations for how you can make them better. Our topics include:
- Inspiring thoughtful discourse when polarizing conversation leads to more buzz
- Talia’s thoughts on traditional media sites removing their comment sections
- Why you should add a respect button to your community content right now
Our Podcast is Made Possible By…
If you enjoy our show, please know that it’s only possible with the generous support of our sponsor: Higher Logic.
“The Engaging News Project holds a series of workshops where we invite digital news leaders to come together to brainstorm solutions to some of the problems plaguing the news media industry. An idea that emerged: Wouldn’t it be interesting if, after you left a comment, before the comment was finalized or maybe even near the comment section, there was a picture of the person or a video of the person that was in charge of the moderation, saying, ‘Hi, I’m such and such, and my job is to moderate these comments.’ Because I think it would put a human face behind what this job really is.” -@TaliaStroud
“What a lot of news media organizations think of comments, is the fact that they threw up a box and did nothing. And then that’s the ROI of comments, is whatever came from that box with no strategy. It’s not even that they have comments or not, it’s that they never even gave it a fair shake.” -@patrickokeefe
“I understand why a news media organization may [remove comments]. I do think that it is too bad, because we see all of these instances in which other entities are encroaching upon what the news media used to do. … As news organizations start to allow more entities to siphon off part of their business model, it makes me concerned for the news media. In shutting down comments and allowing all of that to happen on a social media platform, a news organization is again kind of giving up that part that used to attract at least some portion of people to their site. And so I think that that’s unfortunate, that they’re not getting that data, getting that relationship with people. I think it’s potentially a dangerous thing to keep doing long-term.” -@TaliaStroud
“[Rather than removing their comments section,] I think that organizations would be better served, if they don’t like what’s happening in the comments space, to think of ways to improve upon it, if they’re able to or to think of another idea, and generate some out of the box idea for engaging people on the site.” -@TaliaStroud
“We worked with a prominent political reporter, who would go into the comment section and answer factual questions or ask people questions that were of interest to him or complement people on a strong comment. … When the reporter went in, we saw the amount of incivility decrease by about 15%, and we saw people provide more evidence for their comments. About 15% of people were providing more evidence for their comments.” -@TaliaStroud
“In our research, what we found is that people were more likely to click on partisan comments that didn’t agree with them when they had a ‘respect’ button there, compared to a ‘like’ button. A Democrat would be willing to respect a Republican comment, even if they didn’t like it.” -@TaliaStroud
About Talia Stroud
Dr. Natalie (Talia) Jomini Stroud is an associate professor of communication studies at the University of Texas at Austin, assistant director of research at the Annette Strauss Institute for Civic Life and director of the Engaging News Project, a research-based organization that examines commercially viable and democratically beneficial ways of improving online news. Stroud is interested in how the media affect our political behaviors and attitudes, and how our political behaviors and attitudes affect our media use.
- The University of Texas at Austin, where Talia is associate professor of communication studies
- The Annette Strauss Institute for Civic Life, where Talia is assistant director of research
- Engaging News Project, which Talia is director of
- Engaging News Project: Newsroom Focused Ideas for Improving Online Discourse by Patrick
- 10 Things We Learned by Analyzing 9 Million Comments from The New York Times by Ashley Muddiman and Talia
- Survey of Commenters and Comment Readers by Talia, Emily Van Duyn and Cynthia Peacock
- Approve or Reject: Can You Moderate Five New York Times Comments? by Bassey Etim, The New York Times’ commenting quiz
- This is Why the Philly.com Comments Are Changing by Joel Mathis, featuring a video of reporters reading comments
- Community Signal episode with Bassey Etim
- Bassey Etim, community editor for The New York Times
- New York Times and Jigsaw Partner to Scale Moderation Platform
- Inversoft, a company that builds filtering solutions for user-generated content
- Community Signal episode with Derek Powazek
- Community Signal episode with Greg Barber
- The Coral Project, which builds tools to improve how journalists and communities engage on news websites
- Andrew Losowsky, project lead on The Coral Project
- Civil Comments, a comment platform touting “hands-free moderation”
- Greg Barber, director of digital news projects at the Washington Post
- 4chan and Reddit Bombarded Debate Polls to Declare Trump the Winner by Andrew Couts and Austin Powell
- Donald Trump’s tweet about the skewed online poll results
- Talia’s tweet about news media use of online polls
- Engaging News Project’s Quiz Creator
- Engagement Buttons by Talia, Ashley Muddiman and Joshua Scacco, a report on the use of “respect” buttons
- TribTalk, the opinion site of the Texas Tribune, which has used a “respect” button
00:04: You’re listening to Community Signal, the podcast for online community professionals, sponsored by Higher Logic, the community platform for community managers. Tweet as you listen using #communitysignal. Here’s your host, Patrick O’Keefe.
00:24 Patrick O’Keefe: Hello, and welcome to Community Signal. On this episode, number 42 of the show, we’re talking with Talia Stroud. If my voice sounds any different, it’s because I’m getting over a little bit of a cold. So I apologize for that, but I’m really excited to have today’s guest on the program. Dr. Stroud is an associate professor of communication studies at the University of Texas at Austin, assistant director of research at the Annette Strauss Institute for Civic Life, and director of the Engaging News Project, a research-based organization that examines commercially viable and democratically beneficial ways of improving online news. She’s interested in how media affect our political behaviors and attitudes, and how our political behaviors and attitudes affect our media use. Talia, welcome.
01:02 Talia Stroud: Thanks for having me.
01:03 Patrick O’Keefe: It’s a pleasure to have you, and I’ve been familiar with the Engaging News Project for, I wanna say, a couple of years now. I’ve talked about it with some friends of mine that work in news media, and community, and moderation, have written about it before, read some of the research before, and always found it interesting and worthwhile. So I’m really glad to finally be able to speak with you here on the show.
01:21 Talia Stroud: Wonderful, I’m delighted to be on the show.
01:23 Patrick O’Keefe: So you analyzed all of the comments that people posted on The New York Times website from October 30th, 2007 to August 13th, 2013. That was a total of 9,616,211 comments. And you found that comments with partisan and uncivil words receive more recommendations, which is essentially the Times’ version of likes. So they don’t have a like button, they have recommend. And partisan words include references like Clinton, Bush, Dems, Repubs. Incivility is indicated by words like “bigot,” “liar,” “dumb,” “hypocrite.” When a comment referenced the political right, it received 3.4 more recommendations than a comment that did not. And when it included both a reference related to the political right and an uncivil word, it received seven more recommendations than a comment that included neither. What do you think that says about the hope for thoughtful discourse in online comment sections, that the comments that are uncivil, and mention a specific side receive more recommendations than not?
02:20 Talia Stroud: I think that it demonstrates that it’s difficult to have a great deliberative conversation in comment sections, particularly when we see these sorts of patterns, even on sites like The New York Times, which dedicates a ton of resources to making that space as great as it can be. I also wanna just add a caveat here, which is to say that when we’re talking about comments that contain these sorts of words, we didn’t go in and read all nine million of those comments. What we did is, we looked for comments that contained the words that you suggested about the political right, and then these other words that were uncivil. So it’s possible that people are saying things like, “It’s too bad that people are calling Republicans uncivil.” Now, in our look over the comment data, that’s not in fact what was happening. These were the comments that are what you would infer from that finding, that were saying uncivil things about those on the political right. And I think it’s just a fact of our partisan and polarized era, that people are more likely to endorse those sorts of comments through their interactions in the comment section.
03:20 Patrick O’Keefe: And that’s a tough thing, because I think a lot of people, and certainly probably most people who listen to this show, and a lot of people who work in moderation in the media, want to reach a certain level of discourse. But there’s no arguing, and the data shows this, at least to this extent, that those sorts of comments, the more polarized things, the more aggressive, maybe the more far right, far left, get more attention, get more traffic, get more interactions with that content, at least as it is currently presented. And as such, they lead to more traffic, which for a lot of organizations is an important metric, time on the page, more traffic, more page views, return visits, etcetera. I don’t know, it’s just sort of an uncomfortable thing. It’s not what we want it to be, is it?
04:02 Talia Stroud: No, not at all. And we’ve heard from news organizations exactly the sentiment that you’re expressing there, which is it’s a very tricky space. Because if you wanna have more clicks and more people on your site, this content is one way, at least anecdotally, that people feel they can get that traffic to their site, by allowing and incorporating this polarizing content. One thing that I would add, however, when thinking about that, is that news organizations have multiple goals. One is, yes, to maximize the clicks on page, to increase revenue. But they also have this democratic obligation, this journalistic obligation as well, and these two goals sometimes conflict with one another. And I think we see that they can conflict in an instance like this.
04:45 Patrick O’Keefe: Yeah. And so you can see kinda both approaches, because there are some outlets who might just let people have at it in the comments, and do so because, well, first of all, they don’t have to pay moderators.
04:55 Talia Stroud: Yeah.
04:55 Patrick O’Keefe: So that removes staff time, and it gets more traffic. So it potentially, at least, it’s not always the case, but potentially could lead to more ad revenue based on impressions alone. Now, it’s not always the case. Sometimes low quality impressions will lead to less revenue, etcetera. But that’s at least the argument there, and it’s interesting to tie that into another survey that you did. You talked to a group of Americans, and you asked them a range of questions relating to online comments. And when it came to moderation, 42.2% of people felt that news organizations should remove offensive comments. But 41.6% said that comments should be considered free speech, and of course, the government definition of free speech is not the case, but for the sake of their argument, should be considered free speech, and not be policed. And if we believe that the key to inclusiveness and a high level of discourse is some level of moderation, do we try to bring that 41.6 to the light? Or, since there is no shortage of choice, do we simply allow them to move on to an outlet that gives them what they want?
05:51 Talia Stroud: Really good question. I think that the survey results show this general ambivalence by the American public about… We see the good things to moderation, and we see the bad things to moderation, and so I think that’s a really interesting finding from that study that we did. And I think when a news organization is then considering what do we do with this information, there are some people that like it, and some people that don’t, what should we actually enact on our own site? And I think that that answer will probably differ by site. But in the event that an organization really wants to create a space where things are more civil, where they have a hand in moderating the comments, and they have the resources internally to do that, I think that being incredibly transparent with their commenters about how they’re selecting comments and not selecting comments, is one way to try to bring people along, by justifying what they’re doing and demonstrating that they’re applying standards consistently. I think that can help people to say, “Okay, we see where they’re coming from.” And you also see a lot of news organizations make the case that, “The reason that we’re moderating comments, is because this is our site, it’s our prerogative to decide what takes place here.” And I think that news audiences can be receptive to that sort of an argument.
07:00 Patrick O’Keefe: To go back to that 41.6% number of people who think that comments should be considered free speech and not policed, I think a lot of that also is down to the fact that people don’t know what comment moderation or community management really is. And whether it’s news media, or online communities like I’m in, it’s that people don’t know what goes on behind the scenes. It’s just like a mystical thing, maybe it’s a shadowy figure in a back room pushing buttons. They don’t understand what happens, and for them, when they view the comments, it’s already done. So whatever they think the comments are, good or bad, on a moderated community or a moderated news media site, the work is pretty much happening without them seeing it. They didn’t necessarily see what was removed, they didn’t see what was there before, so they don’t necessarily understand the necessity of it.
07:40 Patrick O’Keefe: And I think part of that is education. And The New York Times has been doing this quite a bit lately, as far as educating people on what happens behind the scenes, and has been for a while. What they do behind the scenes, how it works, what’s the value of moderation, etcetera. But I wonder if that’s part of that equation too, if that percentage could be knocked down by people. And this is probably just good transparency, especially for the news media, is saying, “This is our moderation, this is how it works, this is our policy, this is what happens when a comment gets removed.” And those are things that we’ve done in the online community space for a while, the idea of public guidelines and then how things get moderated. So I think that also could help a lot here.
08:13 Talia Stroud: Definitely, I think that there needs to be increased transparency about what happens to the comment after you’ve left it on a news site. And I agree, The New York Times has done a wonderful job. I’ve been thinking right now of, I don’t know if you saw the post going around, I’m sure you did, of try to moderate some of the comments from The New York Times site.
08:28 Patrick O’Keefe: I did.
08:28 Talia Stroud: Which was super interesting, and really well done. I think that that’s a helpful thing to do, to show people, “Look, they’re tough calls, and you have to try to do this.”
08:36 Patrick O’Keefe: We’re gonna link to that New York Times quiz in the show notes, it’s five questions. And I’m embarrassed to say that I went three out of five. I’m gonna admit that publicly right now. So I need them to train me up, I need them to coach me up a little bit. And it underscores the fact that every community is different. And I know that, and I approach different communities differently, but I didn’t have a good enough understanding of The New York Times commenting guidelines, to be able to be a good moderator. ‘Cause three outta five, not a good ratio, it just… That would not be good enough. And I have to say also, that my girlfriend, who’s a native New Yorker, was five outta five, and she doesn’t even work in this space. So she got it perfect, but I got stuck on a couple of questions. So take the quiz and see how you rank.
09:20 Talia Stroud: I think that there are a lot of interesting ideas about how to increase the transparency about this space. One really fantastic idea, in my view, the Engaging News Project holds a series of workshops where we invite digital news leaders, usually a very small group, to come together to brainstorm about solutions to some of the problems plaguing the news media industry. And at one of those, an idea that emerged was, wouldn’t it be interesting if after you left a comment, before the comment was finalized or maybe even something nearby the comment section, there was a picture of the person or a video of the person that was in charge of the moderation, saying, “Hi, I’m such and such, and my job is to moderate these comments?” Because I think it would put a human face behind what this job really is. The other thing that occurs to me is philly.com had put together a really interesting video, where they had some of the journalists read… I can’t recall if it was comments or social media, or some combination of the two, to their audiences, about them and their articles. And I think it really demonstrated to the public like, “Look, there are people behind these stories. And when you say some of these things, it affects real people, that this is their job.” So more efforts like that, I think would have a powerful influence.
10:35 Patrick O’Keefe: Yeah, I love the idea of identifying people. I think on my communities, I’ve always had the moderators, they’re identified as a person, and I’m identified as a person. And some people have suggested that in some cases, maybe the administrator, or the managers, or the moderators, should have a central account where they handle things from, and so there’s no real name, there’s no real person there, it’s just the admin account. And I think that is understandable in some ways, because of the backlash that we can face, but also does more harm than good in most cases. Because I find that it’s good that people know that I’m a person, and that I’m this person, and that I manage it this way. So even if they don’t agree with me, maybe it helps them to at least look at me with a tad of empathy, as opposed to feeling that I am some shadowy figure that is just sort of figuring out what to do with their content behind the scenes.
11:20 Talia Stroud: I think that having a person there with a face does affect people in a way differently, than some generic sort of logo. And some of our research looking at what happens when journalists or news stations get involved in comment sections, would support that idea that there’s something more about a human, a person getting involved. But in saying all of this, I also don’t want to be insensitive to instances in which, if the tone of the comments gets overly aggressive, that that may actually be detrimental to the person who’s in charge of moderating that community. And so I don’t wanna be insensitive to saying that there aren’t instances in which a moderator, for very legitimate reasons, might not want to get involved in that space. So I think that there’s an important sensitivity to context, when making these sorts of statements.
12:04 Patrick O’Keefe: I think that’s fair. I would like to take a moment to recognize our excellent sponsor, Higher Logic.
Higher Logic is the community platform for community managers. With over 25 million engaged users in more than 200,000 communities, organizations worldwide use Higher Logic to bring like-minded people together, by giving their community a home where they can meet, share ideas and stay connected. The platform’s granular permissions and powerful tools, including automated workflows and consolidated email digests, empower users to create their own interest-based communities, schedule and manage events, and participate in volunteer and mentoring programs. Tap into the power your community can generate for you. Higher Logic – all together.
12:43 Patrick O’Keefe: And you mentioned the idea of different goals being at play. Sometimes they have journalistic goals or quality goals, sometimes they are definitely traffic or metric-based goals and trying to raise those metrics. We had Bassey Etim on the show before, Bassey is a friend of mine, leads community at The New York Times. And as you referenced, they have this 14-team community desk, and they put in the time, with the goal of ensuring that the comments of The New York Times match the quality of the editorial of The New York Times. So that’s the goal there. And I think that makes sense for them, and they have the data that shows that engaged commenters become subscribers, which is a bottom line revenue metric for them. And then you have, of course, other outlets who take a totally different approach to it. And it’s interesting to think about those differences.
13:21 Talia Stroud: Absolutely agree. There’s just so much variability here, and it just depends on what the goal is of that space, and frankly what the resources of the newsroom are. Because there are numerous newsrooms out there, that have told us, “We would love to have beautifully curated comments, that go through extensive moderation, but we literally do not have the staff time to make that happen.” And we’ve had a number of organizations try to come up with algorithms that could identify uncivil comments, but I don’t know of any algorithm that says, “Oh, we have 100% confidence that you just put the comments through the algorithm, and then it’ll be completely fine.” There has to be some human curation that takes place at the same time. So I think it just depends on the newsroom, what choice they’re making, and the rationale that they provide for that decision.
14:05 Patrick O’Keefe: Yeah, and I know that The New York Times, speaking of them, recently announced that they’re partnering with Jigsaw, which is owned by Google’s parent company, to kind of take a look at the idea of AI in the comments. So I think that’ll be interesting to see, but the reality, and I think most people who work in community would say the same thing, is that it’s tough to see a time when the AI and the algorithm is so good that we won’t need humans. Like the idea of automation in community spaces, is that it tackles comment issues that are repetitive, that can be easily identified. So that has a lot of potential in comment moderation, and it’s already utilized. There are already companies out there, like an Inversoft, that may not be a company that a lot of news media or organizations know about, but it’s a company that serves a lot of online community spaces. You tie into their algorithm, that’s what they do, it’s what they’re good at. They focus on identifying, not only the types of negative comments, but also negative behaviors, like child grooming. And identify those behaviors automatically, and work to kinda knock them out. And then the nuanced issues, of course, get left to humans. But if you don’t have a human, as you said, some news media organizations don’t have a human free that can look at these things.
15:06 Patrick O’Keefe: When I hear that, sometimes I think that, yes there are some companies or some organizations that are so cash strapped, that they cannot justify it, and that’s fine and I understand that. And then sometimes I think that they don’t necessarily understand the value of hiring even one person, that they would just hire three more writers, or another author, or another journalist, when they already have 10, versus adding the first community person. Do you see that all? Do you think that’s true at all?
15:31 Talia Stroud: I think you’re definitely right, and I think it’s just a value proposition that hasn’t been tested clearly. So if we had some sort of experimental test, where some organizations really dedicated the time and worked on moderation, and had others that didn’t, and then compare it over time, are those that are dedicating the resources to moderation seeing an increase in traffic? Are they seeing an increase in people returning to their site? Are more people subscribing in comparison to those that aren’t? That would be really persuasive evidence, and I think we would see a lot of news organizations change their practices, but we really don’t have that sort of clear evidence to sway people one way or another. And I think there’s a lot of popularly covered stories right now, about organizations that are shutting down their comments, and I think that that looms large.
16:17 Patrick O’Keefe: It’s also the type of thing where, it’d be nice if outlets… And I’m sure some are, obviously some are, because some are finding value or making changes. But the idea that, “Yes, let’s try it.” And it can be tracked, the question is, do they know how to track it? Or do they have the tools? Or is the platform that they are forced to use by the conglomerate, that sits over them as a publication, because a lot of small or local media are owned by one of the big overarching media organizations, but does their platform allow them to see, for example, “Okay, these are commenters. Okay, so these are also commenters who are subscribers. Are subscribers more likely to be commenters, than they are to be just the average everyday viewer?” And if you see that there’s a percentage gain, then you can kinda forecast out what that gain represents. So there’s value… If subscriptions are your model, if ads are your model, then obviously you can look at page views, and people who are commenting, and the page views on stories that are popular with the comments, etcetera. So the data can be there, if someone wants to experiment and hire someone on the cheap to be a moderator for a year. But the question is would they even be able to know how to access the data, or have the tools to do so?
17:16 Talia Stroud: Really great question, and there are kinda two aspects to this, I think. The first is, do they have the analytics to know who the commenters are, and to track them through the rest of their media experience? So are they subscribers, are these folks that are returning to the site more frequently? And I think you’re correct that lots of organizations do not have access to that data, because the commenting platform that they use, doesn’t provide them with any unique identifier that they could connect to the person, and then find out how they’re traveling in other parts of the site. So that makes it next to impossible to find out what’s happening there, from a data analytic sort of perspective. I do think though that there are opportunities for news organizations, even those that are owned by a parent company, to do some of this experimentation in their newsroom. And I think that that’s why I founded the Engaging News Project, just to really work with news organizations that have an interest in answering these sorts of questions, because we can help them to design some sort of a study or a way to analyze this internally, that doesn’t rely on using those sorts of analytics. So for example, you could envision some sort of a study where, on some subset of articles but not on others, you have heavy moderation on some and not others.
18:24 Talia Stroud: And then you track over time, what happens on those where you moderate and those that don’t? Are there differences in the quality of the discourse? Are people returning to those pages more frequently? Are you getting responses from your audiences, saying, “Hey, this article was so great, I really appreciate that there was some moderation taking place?” And we’ve done some work related to that that I think is really promising. So I think that the limits that news organizations face in terms of their analytics, don’t necessarily need to be limits on their creativity and their thinking about how to analyze how these things work in practice.
18:57 Patrick O’Keefe: And when you think about moderation, versus non-moderation. For me, non-moderation is not even an option. When I talk to people about community, you can’t just throw up a box somewhere. When I had Derek Powazek on the show, and he’s a respected person in the community space, long time veteran, he made a great analogy: “If you throw a cardboard box out to the side of the street, it’s not gonna fill up with gold, it’s gonna fill up with garbage.” And the same is true here, like that box needs to be prettied up, needs to be well lit. Post some rules on the side, like what the heck goes into that box? And the same is true here with a lot of news media organizations. And I’m sure I’ve talked about this, when I had Bassey and when I had Greg Barber of the Washington Post on this show. But the idea that… What a lot of news media organizations think of comments, is the fact that they threw up a box, and did nothing. And then that’s the ROI of comments, is whatever came from that box with no strategy. And so it’s not even that they have comments or not, it’s that they never even gave it a fair shake.
19:54 Talia Stroud: I think you’re right. And I think at the beginning, putting a comment box on a website, it was kind of a novel, interesting thing, “Let’s open this up, the internet could be revolutionary, we could create a wonderful space for citizen discussion and participation.” And those were really, really well-intentioned ideas, and I think that audiences also got used to that. So now, to audiences a comment section in most news media forums, means the space where anyone with any view can spout off a little bit. And if you’re not doing anything in that space, then we see the results of what happens there; you can see spam, you’ll see all sorts of trolling. It becomes a place that’s not this ideal vision of what a deliberative situation, or what a beautiful commenting space might happen to be. It can go down to the lowest common denominator notion. So, I think that we’re seeing more and more evidence that organizations that are dedicating time and thinking about this space, are coming up with ways to make it better. And so I think that that’s really encouraging, and I’m really optimistic about those organizations that are willing to dedicate the time, to really understand how to provide a space for citizens to interact with the news media.
21:04 Talia Stroud: Because that, I think, is the real power of moving more toward digital news. And I think that communities, and citizens, and audiences, whatever term you wanna use, have a lot to share with the news media, and news organizations can benefit from that interaction. It’s just figuring out the right combination to put it together, to get to a point where it’s this great space.
21:24 Patrick O’Keefe: You mentioned news media organizations closing their comments. And it seems like when that happens, it’s a story. No one reports on the opening of comments, or that “They’re still there.” But this organization, whether it be NPR or Popular Science, or whoever, is closing their comments. What’s your position on that? What’s your position on a news media outlet opting to completely remove public reader to reader engagement, on their own websites, be it comment sections or something deeper, and simply pushing all of that dialogue strictly to third party media platforms?
21:57 Talia Stroud: I understand why a news organization might do that. If you don’t have any of the resources to moderate that space, then it’s turned into a place where people are aggressively debating things with each other, in uncivil terms, and it’s just turned into kind of a pit. I understand why a news media organization may make that decision. I do think that it is too bad, because we see all of these instances in which other entities are encroaching upon what the news media used to do. So we see Craigslist coming in on where we used to have classifieds, we see weather.com coming in to the weather. And as news organizations start to allow other entities to siphon off part of their business model, it makes me concerned for the news media. So in shutting down comments and allowing all that to happen on a social media platform, a news organization is again kind of giving up that part that used to attract people to their site, at least some portion of people to their site. And so I think that that’s unfortunate, that they’re not getting that data, getting that relationship with people. I think it’s potentially a dangerous thing to keep doing long-term. So I think that organizations would be better served, if they don’t like what’s happening in the comments space, to think of ways to improve upon it, if they’re able to, or to think of another idea, generate some out-of-the-box idea for engaging people on the site.
23:20 Patrick O’Keefe: People ask me, when I talk about news media, ’cause I do talk about news media comments, and why I think comments are good, and how I think it should be done. People ask me, “What’s a great example of a news media comments section?” And I’m sure you get that question all the time. Now for me, I often point to The New York Times, as we have here already. So, I think that’s a pretty common example. They do a great job, they have great resources, they put 14 people on it. It makes sense. Are there any other… And I’m sure you know a lot of people, so I don’t want you to offend people by omission. But, are there a few really good examples of news media comments that you point to when people ask?
23:50 Talia Stroud: I also often point to The New York Times because of how many resources they dedicate to that space, to making it a really vibrant space. But, I think that there are also, and I’ll be generic, I think that there are also some local media outlets that are really doing a nice job in the comments. And I think part of that is because if you’re local enough and you have a community feel there, people actually know other people in the community. So there is a social pressure sort of feeling there about what comments are acceptable and what aren’t, and I think that that helps to create that space. Now that’s not really the news media doing anything special, it’s them taking advantage of being in a smaller community, where knowing other people affects the quality of the comments that are left in that space. So, I think that that’s actually an interesting lesson potentially, for those outlets that are larger, is thinking about how do you create that sort of a sense of community where people self-police, almost, in what comments they’re leaving in the space.
24:43 Talia Stroud: I also think that there have been creative ways to think about comment sections. So, the Washington Post has done some comment sections that don’t necessarily allow people to leave any comment whatsoever, but they’ll give them a series of choices. So, “Which of these following choices represents most what you feel?” And then they’ll check a box, and then after that they could leave a more open-ended comment, but I think it directs the conversation in a really nice way. So I think that that’s also a productive way to use comment spaces, or to direct the conversation more clearly. We also worked with an organization that was looking at having journalists get involved in the commenting space. So what happens when journalists go into the comment section, and answer people’s questions for instance, and I think that’s another example of a really strong practice that can help to cultivate a better space for people to discuss issues.
25:35 Patrick O’Keefe: Speaking of reporters getting involved in the comment sections of their articles, there are various reasons why many choose not to. You know the amount of volume, rude remarks, personal attacks, that’s a couple of the reasons. And I’ve thought about this problem, and on the show previously, I thought that it would be helpful if along with just denying and approving comments, that if you had community resources or moderation resources, that that team actually identified comments, and essentially sent them to the author of the piece, through a dedicated dashboard in the commenting software, if they had the capability. So they can get to those pressing questions or those thoughtful comments, without having to sort through the rest that kind of drags them down. Do you think that is a viable solution? Or what’s the answer, how are people dealing with this?
26:19 Talia Stroud: So, I think that’s a great idea, I would love to see a news organization try something like that, where they’re filtering somehow the comments out. And I know Coral Project has a lot of different ideas on their plate.
26:29 Patrick O’Keefe: Actually, I mentioned that idea. It’s funny you mention that, ’cause I actually met up with Andrew from the Coral Project a month ago in The New York Times’ building, and I’m pretty sure I told him that idea. I said, “Here is something that you should maybe do.” So yeah, I’d love for them to do that, that’s funny.
26:43 Talia Stroud: Yeah. I really like that idea because I think it does deter people a little bit from getting involved if they see a lot of uncivil comments. From what we have heard from journalists in general, most at least occasionally do glance at the comments. So it’s not really a no-man’s land, but it certainly can be off-putting if you don’t see comments that are more engaged in the conversation. I’ll share with you one bit of research that we did, that I found to be pretty interesting and related to this. I mentioned it just briefly earlier. But what we did is, we worked with a news organization, and across several months, on some days, we worked with a prominent political reporter who would go into the comment section and answer factual questions, or ask people questions that were of interest to him, or compliment people on a strong comment. On other days, we had the station… A generic station logo. This was a television news station. We had a generic station logo and a staff member go into the comments and do this. And on yet other days, we had no one go into the comments, and we tracked over time what happened to the substance of the comments, with these three options.
27:44 Talia Stroud: And what we found is that when the reporter was involved, and this was not the case for the station, but when the reporter went in, we saw the amount of incivility decrease by about 15%, and we saw people provide more evidence for their comments. So about 15% more people were providing evidence for their comments. And I think that this demonstrates some of the effects of having journalists get involved in the comment section. And again, there is the issue of resources. But the thing I would also add is that the reporter in this case got involved maybe four or five times on average in the comment stream, so it wasn’t as though he was responding to every single comment that appeared. And even that minimal level of involvement actually made a difference in the space. So it may be thinking outside of the box by doing things that the journalist can just do on their own, that could improve what’s happening there.
28:35 Patrick O’Keefe: That’s interesting. And when I think about journalists, and getting into comments, I really don’t like it, or I don’t think it’s helpful when… And it happens on more of a local level, but when journalists are involved in moderation of the comments themselves, I feel that’s not a good recipe, for a couple of reasons. I feel at least journalists can get even more beaten down. They have a job… It’s writing, and it’s being a journalist, and it’s creating content and all these things. That’s a job. Moderation is a job. And again, we’ve talked about cash-strapped organizations, so we can’t always afford to hire people, so that’s why this happens. But that probably leads to such a bad view of the comments, when you have to be the one who writes the story, and then who sees the feedback to your story. And doesn’t just deal with the thoughtful feedback, negative or positive, but also just the awful feedback on the story. It just gives you such an awful view of the comment section, because as much as we can be tough and even-keeled, people are still human. And so, eventually you’re gonna get beaten down by that stuff.
29:29 Talia Stroud: Yup. And there definitely are other options, beside having a journalist moderate, or even having a moderator as part of the staff. I’m thinking here of Civil Comments and some of the work that they’ve been doing, trying to have the community rate comments before their own comment appears on the site. And I don’t know how well that is working or not, so this was not an endorsement. But it’s a really interesting idea that they have, about what if the community has to self-moderate in some way, before their comment appears? And I think that idea is really intriguing.
30:00 Patrick O’Keefe: Yeah, I think that’s interesting too, and I think Greg Barber of the Washington Post mentioned that when I had him on the show. And one thing that strikes me about that is, it can only be a certain type of comment, because we don’t wanna throw the bad comments, the really bad ones, out there in front of our readers, and say “Specifically, look at this one and rate it. Here’s the worst. Please choose it! I’m sure your day is better for having seen that.” That just speaks to the nature of the problem, which is tough, because someone has to look at it. And if it’s not the news media organization, then it’s probably the readers.
30:27 Talia Stroud: Yup. And if you’re not moderating, then you’re putting it in front of them regardless.
30:31 Patrick O’Keefe: Yeah, exactly. Following the first presidential debate, The Daily Dot wrote about how Reddit and 4chan had worked to skew online poll results toward Donald Trump, which he of course ended up tweeting as proof that he had won the debate. You tweeted about that story, and you said that news orgs should not put polls on their sites. And it’s interesting to me, because polls… Being someone who’s developed websites since the mid-to-late ’90s, I probably had a poll on my first personal website on Angelfire back in ’95 or ’96. Polls are such a long established part of the web, but they are also a part of the web that longtime practitioners know are very, very open to abuse. And the fact that they are such a natural part of the web… It’s evidenced in the fact that Twitter added polls less than a year ago, as a new feature. And they’re still there, people want polls. But having online polls and having actual scientific polls, polls that are properly conducted and actually reflect the opinion of a group, are two different things, obviously.
31:28 Patrick O’Keefe: Online discussions may require a little more buy-in than simply clicking an option and pressing submit, but they can also be manipulated similarly by coordinated groups. And really, they already are. We know this is happening, there’s been stories about it. Groups get together, they manipulate the comments of an online forum, an online community. They get together and they post the same opinion, or they post in a way that makes it look natural, but really, they’re just sort of manipulating the information that the public’s getting. It’s such a tough problem, especially for small organizations. But how do you counteract that? How can we maintain some balance in the comments section, to try to limit the impact that those sorts of coordinated campaigns can have?
32:05 Talia Stroud: It definitely can be a problem, both in the form of online polls and in the form of comment sections, to have people go in and populate a space with their particular view. And it’s related to a problem that researchers call exemplification, which essentially is including a bunch of little narrative stories or case studies within even a news article, because this happens all the time, even in news reporting. And what they find is that these sorts of case studies, or vignettes in an article, can really affect what people think about the broader public. So if I read a bunch of vignettes, all of which indicate that Trump won the debate, I may start to think “Oh, maybe the public in general thought Trump won the debate.” And one thing that they’ve tried to do in that area of research to counteract this is include some statistical information. So perhaps from a poll that’s been conducted using more scientifically rigorous methods, saying “Hey, here is the actual data about the percentage of the public that believes that Trump won the debate.” And the crazy thing from some of this research is that people in fact remember the distribution of opinion from the vignettes more than they remember the statistical information. So it doesn’t fix the problem…
33:13 Patrick O’Keefe: I think anyone watching the presidential election this year can probably believe that one.
33:17 Talia Stroud: Yes, and one thing that they’ve done, though, is try to make that statistical or base rate information even more prominent. So if you include it more than one time, if you underline it, if you put a note somewhere really clearly saying “Based on a very carefully done, methodologically rigorous poll, done by Organization X, maybe the Pew Research organization, the actual result was… ” And then include that information, but display it more prominently, that does seem to counter some of the effect of having these narratives that sway the distribution in a way that’s not commensurate with public opinion. So I think to the extent that including base rate information is possible, I think that would help people to understand what’s happening.
34:01 Patrick O’Keefe: Include real data. Include facts. Include information.
34:03 Talia Stroud: And highlight it.
34:04 Patrick O’Keefe: Do you feel like the inclusion of online polls like this is kinda sloppy by major media organizations? Certain orgs, certain outlets behind the polls that Trump cited in his tweet, would not be viewed by me as credible media outlets, but there are some that would be. And those are big names that people know and a lot of the country respects. Online polls, should they simply be dead for those traditional media or respected media organizations?
34:27 Talia Stroud: I believe that they should. I cannot believe that news media organizations are continuing to put polls that they know are completely unscientific on their sites, when we know that people can use that data and believe that it represents something larger than it really does. It’s a tricky thing, because if you’re a news media organization and you are in the business of informing the public, putting an online poll on your site runs the risk of actually misinforming the public. It’s doing the exact opposite of what the whole objective of the news media is. So I just firmly believe that online polls are not something that should be included. And even when they are included, oftentimes you see no disclaimer, so no statement saying “This is truly for your amusement only. Don’t read anything into this data, it represents nothing at all. It doesn’t represent viewers on this site, it doesn’t represent anything. It’s just basically a game.” So I feel very strongly that news media organizations should not be using online polls on their sites at all. And in fact, as an alternative… The impetus, I think, to include them is that you can get a lot of clicks. People find that interesting, it’s fun to do, it’s kind of curious to see what happens.
35:38 Talia Stroud: But I think that a stronger thing to do would be to include a quiz that includes factual information afterward, that provides people with the opportunity to learn more about what’s happening. So those can be engaging as well, and would meet the mission of the news media to inform the public. And the Engaging News Project actually developed a quiz tool that’s completely free, which over 20 news organizations have used to date, and which I think is a nice alternative to the online poll.
36:03 Patrick O’Keefe: Polls dead. Got it. That works for me, I don’t really… I feel polls… I remember when I’ve launched different websites over the years. I had a stock market website and a sports website, we’d throw polls up there because it was natural and it made sense at that time. And to be fair, we weren’t talking about things that were super important. It doesn’t matter if you think the Bears or the Steelers are gonna win the Super Bowl. That’s not really a matter of critical importance for the country. So you can still have polls for stuff like that, that’s fine, have polls for that stuff.
36:30 Talia Stroud: Yes, for entertainment purposes and acknowledging that that’s the purpose of this, it’s for entertainment.
36:36 Patrick O’Keefe: Exactly, entertainment. Like, “Who won the Bachelor?” That’s a perfect poll topic. Who should be president is not.
36:41 Talia Stroud: I would agree.
36:43 Patrick O’Keefe: Microactions continue to creep into community spaces. Years ago you tested the use of a respect button, as opposed to like or recommend. And I like that, it’s something I wrote about, something I’ve talked about with people in person. The idea of having… How I described it is, sort of an emotion-neutral microaction. You don’t like it, you don’t love it, you don’t hate it, you just respect it. You acknowledge it, it happened, you can respect this perspective. And in your research, you found that in at least some contexts, a respect button was more commonly used, and used by more people who disagreed with the premise of the particular post they were reacting to. But I haven’t really seen anyone give it a serious go. Facebook added a series of emotive microactions earlier this year, and they added love and anger and sadness. And that’s fine, I kinda hoped they had had a respect in there, as I bet you probably had that thought too. But respect wasn’t one of them, and it makes me think, is this a concept that you’re still talking about, is it one that you still feel has potential?
37:43 Talia Stroud: We definitely think that this still has potential. As you said, in our research, what we found is that people were more likely to click on partisan comments that didn’t agree with them when they had a respect button there, compared to a like button. So a Democrat would be willing to respect a Republican comment, even if they didn’t like it. And I think that these sorts of ideas to improve the space have so much potential, and this is just one very small component of it. But if we found a whole series of things that could help improve the space, and make it better in terms of people encountering views unlike their own, I think this would be wonderful. TribTalk, which is the opinion site of the Texas Tribune, adopted the respect button and has told us they believe that it’s one of the factors that contributes to increased civility in that space. So I’m really encouraged by that fact, and would love to see more organizations, especially Facebook, that would be wonderful, adopt the respect button. Because we do believe that it does have an effect in terms of how people react to political things, when they see them online. So I think that that idea specifically, and other ideas connected to it, have so much potential, and I really hope that we see them adopted more frequently in the future.
38:50 Talia Stroud: I think one of the reasons that these things aren’t easily adopted is because so many places are locked into a comment platform, so they are locked into a contract with Facebook, or with Disqus, or one of the other prominent organizations that don’t have that button as an option. I will make it my plea. I think the research supports including a respect button, or at least experimenting with some of these ideas, to try to make partisans acknowledge views unlike their own.
39:17 Patrick O’Keefe: It’s really not that hard for a platform provider, for a vendor, to add additional microactions. You could even make it so it could be self-defined, you could allow people to set their own microactions and then it extends into the community. I have like no money, and when I redesign my community, I’m planning to add microactions, and we’re gonna have something in there that’s akin to respect, because I wanna play with it and see how it works, and kinda cut down on those short replies that you get, which is one of the benefits of microactions. It’s that if people like something, they don’t have to reply and say, “Thank you.” They don’t have to reply and say, “I respect that.” They can just click “Thanks,” or they can click “Respect.” So like I said, I have nothing, I have no resources right now… Not exactly, but close enough… And I’m gonna experiment with this, so anyone really can. Talia, thank you so much for coming on the program.
40:00 Talia Stroud: Such a pleasure, Patrick. Thank you for doing this podcast.
40:02 Patrick O’Keefe: We have been talking with Dr. Talia Stroud, an associate professor of communication studies at the University of Texas at Austin, assistant director of research at the Annette Strauss Institute for Civic Life, and director of the Engaging News Project, a research-based organization that examines commercially viable and democratically beneficial ways of improving online news. Find out more about them at engagingnewsproject.org. If you have any questions that you’d like me to answer on the air, please submit them at communitysignal.com/qa. For the transcript from this episode plus highlights and links that we’ve mentioned, please visit communitysignal.com. Community Signal is produced by Karn Broad, and we’ll see you next week.
Do You Have a Question for the Show?
If you have a question that you’d like me to answer on the air, I’d love to hear it! Please submit it for a future Community Signal Q&A episode.
Thank you for listening to Community Signal.