But what is a reasonable expectation for the public, when it comes to people who use live video to gain attention for their violent acts, against themselves or others? Heather Merrick, community experience manager at group video chat service Airtime, joins the show to discuss. Plus:
- How allowing users to switch video chats from public to private, and back, complicates community management efforts
- What happened when Tumblr switched replies off on their platform
- Unethical behavior and the implications of getting caught
“It’s nice to have that option, [to flag content simply because you don’t like it,] because it’s essentially, ‘We want you to feel comfortable.’ It’s the comfort option. ‘Maybe I just don’t feel like seeing this slightly icky image that isn’t really offending me deeply but isn’t really my vibe today. I don’t want to see that.’ You can hide it, and I think that’s useful. You don’t necessarily want to be getting someone else in trouble. I think some moderation tools can feel really serious, like if you’re blocking someone on various platforms. … Are you notifying the person that you’ve blocked them? If so, that’s kind of like a big step in that relationship, right?” -@heatheremerrick
“Ambiguity is not usually the friend of community.” -@patrickokeefe
“In the world of engineering, it’s so hard to nail down an exact date for when something’s going to happen because something unexpected will always arise, so I completely understand the need to have somewhat ambiguous timelines. I think if you were to say, ‘Hey, we’re embarking on this gigantic project and we guarantee it will be live by July 3rd,’ you’re going to let people down. … There’s somewhere in between, where you can sort of be like, ‘Hey, this is going to take us a few months and we promise we’ll give you updates along the way, so stay tuned. Two weeks from today, we’ll let you know.’ That kind of thing. It could be considered overcommunicating.” -@heatheremerrick
About Heather Merrick
Heather Merrick is an experienced community and support team manager. She’s worked for startups since 2008, including Tumblr, Stripe, Automattic (makers of WordPress.com) and Airtime, where she is currently community experience manager.
- Heather’s website
- Airtime, a group video service, where Heather is community experience manager
- “19-Year-Old Commits Suicide on Justin.tv” by Liz Gannes for Gigaom
- Twitch, a video game livestreaming service and community
- “Man Wanted for Facebook Murder Video Kills Himself, Police Say” by Hannah Kuchler for Financial Times, about the man who used Facebook to announce his intent to murder someone, commit the act and then confess to it
- “Community Standards and Reporting” by Justin Osofsky, VP, global operations at Facebook, covering Facebook’s response to the event
- “Genius: $56.9 Million in Funding, 6+ Years to Add a Report Abuse Button” by Patrick, about Genius’ poor approach to community abuse
- Google Forms, which allows you to create surveys
- Periscope, a live video service where some Airtime users had previously connected
- “Your Replies Are on the Way, Tumblr” by Tumblr staff, covering the platform’s response when users were unhappy that they removed replies from their service
- Elizabeth Tobey, formerly Heather’s boss at Tumblr
- “Unroll.me is Sorry-Not-Sorry it Sold Email Data to Uber” by Jon Fingas for Engadget
- “One Billion Yahoo! Accounts Still for Sale, Despite Hacking Indictments” by Vindu Goel for The New York Times
- “Uber’s CEO Plays With Fire” by Mike Isaac for The New York Times
- “The Price of Nice Nails” by Sarah Maslin Nir for The New York Times
- Heather on Twitter
00:03 You’re listening to Community Signal, the podcast for online community professionals. Tweet as you listen using #CommunitySignal. Here’s your host, Patrick O’Keefe.
00:20 Patrick O’Keefe: Hello, and welcome to Community Signal, episode number 68. I am your host, Patrick O’Keefe. This week, I am joined by Heather Merrick. We’re going to be talking about moderating live video, communicating what happens when content is reported, and the growing norm of security hacks. Heather is an experienced community and support team manager. She’s worked for startups since 2008, including Tumblr, Stripe, Automattic (the makers of WordPress.com) and Airtime, where she is currently community experience manager. Heather, welcome to the program.
00:48 Heather Merrick: Hello! Thank you for having me.
00:51 Patrick O’Keefe: It’s a pleasure to have you. So, I want to get right into Airtime. Airtime is a group video chat, both public and private. This is really an interesting space to me because of the challenges that exist for abuse with live video. You know, I don’t want bad things to happen. I don’t want people to hurt themselves or others. I’m not hoping for it. I don’t want it to happen. But the reality is that the bigger your company gets, the more people use it, the more that just becomes almost a certainty. And not you as in Airtime, but you as in startups who have UGC of any kind, especially live video. And I think back to 2008, when a teenager committed suicide on Justin.tv, and it was a huge story. But whether it’s suicide or murder or some sort of violent crime, sometimes, the people who do these things want attention. They want an audience, and these are the platforms they will go to. Are you sure you still want to do the show? [Laughs]
01:42 Heather Merrick: [Laughs] Yeah. I love that you came right out with this topic really swinging at me.
01:46 Patrick O’Keefe: Right out of the gate, right out of the gate. You know, I don’t beat around the bush here. But seriously, does that sort of thing keep you up at night?
01:51 Heather Merrick: It hasn’t until right at this moment. Maybe I will not sleep so soundly tonight. This is good timing, because we actually sort of decided today that we will make monthly meetings a thing within Airtime around the topic of content moderation in particular, and for us, that isn’t just live video, because we are an app that also supports posting content from other sources, like YouTube, Soundcloud, Spotify, and you can text message within the chat, as well. So, we have people who are reporting all sorts of content, not just live video. But I’m feeling really lucky that, so far, there hasn’t been a scandal along the lines of the ones that you’ve mentioned there, and we do really care, obviously, about preventing something like that from happening, and from really giving people the option to feel in control of a situation, if there was a situation along those lines, and we want to be informed. Like, we don’t want to discover days later, or even hours later, that something really disturbing has happened on our platform at all.
02:55 Heather Merrick: So, we do give people tools to flag content. We are working on really perfecting those and making it obvious to people what will happen on our end when they do that. So, yeah, there’s always room for improvement. I think Twitch is an example of a platform that seems to have a really robust set of moderation tools. They’ve been around for a while and they have a huge user base. We’re not really, like, Twitch level at the moment, but I think it’s important to have people like them to look up to and set a really good example for that sort of stuff.
03:33 Patrick O’Keefe: Maybe that’s a good idea. Maybe I should get someone from Twitch on the show, if Amazon will give me an okay! It’s interesting, because I think that it’s unfortunate that a lot of startups don’t really think about these things until they happen, and as I said to you via email before the show, I think that it’s good for companies to not only have a competent community person, but also to listen to them. And those two things don’t always go hand-in-hand.
03:54 Heather Merrick: Right.
03:56 Patrick O’Keefe: So, it’s awesome that you’re working on those things. I’m interested in the idea of… you know, you mentioned flagging videos, but actually making sure it’s clear to them what is happening after that happens. Talk about that a little bit. So, I’m imagining some sort of report functionality, a flag, a button. Someone hits something, they might select a reason, maybe not, and then it goes off to you. What do you want to make clear to them about what’s happening?
04:16 Heather Merrick: Across different platforms, I think flagging accomplishes different things, and we should make it really clear who we are similar to and who we are not similar to. At the moment, if you flag content in Airtime, you are hiding it only for yourself and submitting a report to our team. But for the rest of the people in the chat, it actually persists in the room until the team has reviewed it and taken some type of action, if necessary, on the content. So, I think an improvement that we can make—and we’ll always be improving—would be to make really clear, “Hey, this is just hiding it only for you.” And also in the future, we definitely want to introduce some more tools that will maybe give a supermoderator within a chat the power to hide something from everyone, and they have that discretion. Again, that’s something we were talking about literally today, so you’re really getting the scoop on our content moderation tools there.
05:13 Heather Merrick: But yeah, I think it’s important to be clear. Facebook has—or at least had; not sure, it’s always changing—the option to say, “I don’t like this,” which essentially doesn’t do much other than hides it from you, as far as I know. But they also have a host of other options that are really granular, and I think that’s great, because it’s important. I’m sure the people who are on their team looking are sorted into all sorts of buckets, and that helps make things efficient, for sure.
05:42 Patrick O’Keefe: Yeah. I mean, that’s sort of the de facto “don’t waste our time” option, right? “I don’t like this.”
05:47 Heather Merrick: Yeah.
05:48 Patrick O’Keefe: It’s not really a policy violation. There’s not anything wrong. And I’m a believer in report options, flag options; I think you have to have them. There’s really never a case where the downside is so bad or so time-consuming that it doesn’t work out. But the biggest problem that people have with flags is false flags: people who are flagging things that are obviously not a violation, or flagging things because they personally don’t like them. So, the “I don’t like this” option is a way of giving someone the feeling that they did something, while also either ignoring those reports, or at least putting them very low on the prioritization list of issues that are reported.
06:25 Heather Merrick: Yeah. I think it’s nice to have that option, because it’s essentially, like, we want you to feel comfortable. It’s the comfort option, where you’re like, “Maybe I just don’t feel like seeing this slightly icky image that isn’t really offending me deeply but isn’t really my vibe today. I don’t want to see that.” So, you can hide it, and I think that’s useful. You don’t necessarily want to be getting someone else in trouble. I think some moderation tools can feel really serious, like if you’re blocking someone on various platforms. Again, there’s sort of a variation there of, are you notifying the person that you’ve blocked them? If so, oh, that’s kind of like a big step in that relationship, right?
07:06 Patrick O’Keefe: Yeah, that’s why I never block anyone on Twitter. I don’t want them to know. A mute is very useful. Block, for me, not so much.
07:13 Heather Merrick: Yes. Thank God Twitter does have mute now. That was useful, and a long time coming, and I think they could do so much more. But that’s a whole podcast episode in itself, I think.
07:22 Patrick O’Keefe: Airtime not only allows for both public and private group chats, but you know, you have the ability to switch them back and forth. How does that complicate things?
07:32 Heather Merrick: Yeah. I think there are other services where it’s really clear that you’re either creating a private hang-out space that’s maybe just you and your SO or best friends, and we have that ability, but that private space can become public. I think we do an all right job right now of making it really clear when it switches over that that’s happening, and it actually clears the history in the room. So, if I’ve shared photos in our private room, if you felt like potentially, for malicious reasons, switching to a public room, everything would be cleared and there would be a clear warning about that. And I think that’s one way of making that transition option one that doesn’t have a lot of risks associated with it.
08:19 Heather Merrick: But again, I think there’s always room for improvement in terms of communication. Like, should we send some sort of alert to every single member of the room when that happens? What should it say? Because right now, it’s really just the person who’s making the change that gets notified, and then the room history is cleared. But again, we’re a pretty new platform, and we have time, luckily. Knock on wood. I’m really tempted to knock on my desk right now.
08:45 Patrick O’Keefe: Is there a difference, from a community management, moderation, whatever perspective you want to take, in how you approach the private versus public? Am I right that private means private, semi-private, at least, in a sense that it’s only between those people, and then if it’s public, then it’s something you see? Or how are you sort of approaching that issue? Like, someone starts a private video conversation, I don’t know, maybe they feel threatened or harassed by that person. Is there any recourse there? I’m just kind of interested in that.
09:12 Heather Merrick: Yeah. Again, I’m feeling so grateful at the moment, because this isn’t really an issue that we’ve had to face very much at all.
09:17 Patrick O’Keefe: Yep. That’s what I do. People get me on email and I tell them what could go terribly wrong.
09:22 Heather Merrick: You know, I haven’t really had a situation where someone said that they were being harassed in a private room, but it could happen, and we do have tools that are available if they feel that they need to do that. And we also have a general, of course, contact form, but I’m not going to pretend that the contact form is an amazing resource for users. A lot of people are not fans of contact forms.
09:43 Patrick O’Keefe: Yeah, no doubt. Well, I mean, thinking about it is more than a lot of people do, so…
09:48 Heather Merrick: Right.
09:49 Patrick O’Keefe: I’m just, like I said, really interested in live video and the approach to it. So, speaking of live video, and you mentioned Facebook, so a couple of weeks ago, a man posted a video on Facebook announcing his intent to commit a murder. Two minutes later, he posted a video of the actual shooting. 11 minutes after that, he used Facebook’s live video feature to confess. Facebook disabled the account two hours and 30 minutes, according to them, after the first video was uploaded. It’s unclear when the first reports came in. They’ve kind of changed their story on that, but obviously, it was within that two-hour window. And in the blog post, they said they needed to do better. Now, this is a tough thing, because Facebook is in a tough spot, and obviously, let me be clear, a very tragic situation. “What if it was your family?” someone might ask. Yes, yes, awful. Awful and terrible for the victim’s family, every minute the video was online.
10:35 Patrick O’Keefe: But do you know how much is posted to Facebook every second? How much video, how much live video, how many reports they receive, how many false reports, like we just talked about, that they have to sift through? Two hours, honestly, isn’t a bad response time, and I kind of think we have to be careful about conditioning the public to expect responses in, I don’t know, five, ten, 20 minutes. Now, don’t get me wrong. There are companies who, I don’t know, exhibit kind of a willful ignorance to this sort of thing, and I’m no friend to them. I’m critical of companies in the community space that don’t take a serious enough approach to these issues. I wrote about Genius.com, who took six years and $57 million in funding in order to add a report abuse button, and up until that point, they required possible abuse victims to join their community and post a public comment in response to the alleged abuser.
11:25 Heather Merrick: Mm.
11:26 Patrick O’Keefe: So, not a good system! And I mean, there aren’t any good ways to spin that, right? You’re either stupid or you’re negligent, but either way, it was good they finally added one. And with that said, though, we have to be understanding of those working in good faith to address these challenges, and as platforms, I feel like we do ourselves a disservice when we set unrealistic expectations.
11:51 Heather Merrick: Yeah. I think that situation’s really horrifying, and in a way, it’s funny that I’m in a role where I’m doing content moderation, because I’m actually a really squeamish person. I’m not going to watch those videos. I just don’t choose to watch those videos. If I was working there and I had to, I would do it, but…
12:05 Patrick O’Keefe: No, I haven’t watched them, either. I’m not going to. Exactly. I don’t want to.
12:08 Heather Merrick: I have no plan to watch them on my own, and yeah, it’s really awful. And at the same time, I do have compassion for the people who are on those teams at Facebook who are tasked with keeping up with the demand. I mean, it’s massive, right? And as you said, false positives. We even have that problem, and we’re pretty new. So, of course, Facebook, which serves a significant percentage of the people on this planet, they’re going to have a lot of those, too, and I do think it’s fair to say that it’s going to take them a little bit of time, for sure. I’m sure they also have learned a lot of lessons from the data that came out of that particular event, in terms of you have to imagine that there was some sort of velocity of reports that contained the same keywords or something. You know, there’s some detection that you can potentially do. I think it is a really hard issue on all sides.
13:03 Patrick O’Keefe: Yeah. I almost… When I read some articles that were critical of this, or read some comments, I almost thought about police. Like, when we call the police, we want a response ASAP. We want them there. You know, five minutes, ten minutes, 20 minutes for police is what we expect. Should we view platform owners in that way? And I don’t think we should, even as platforms become more and more important. I will be the first to admit, though, that obviously I am burdened by a community management perspective and experience on this issue. And so, what happens with these types of stories… Because they happen. I mentioned Justin.tv, this is the Facebook story, there have been many others, there will be many more, unfortunately, and when this happens, you often have a lot of speculation about how this sort of thing works, how moderation works, how quick people should respond, from people who really have no sense of what it means to scale moderation of UGC and have it be this large and this much content.
14:00 Patrick O’Keefe: It’s like anything else. You know, there are stories, things happen, and some people will have some sort of insight into that, and know how it works to manage those situations, and some don’t. But again, I sympathize with Facebook and sort of the PR side of this, I guess, of saying that they’re working to get better. We’re all working to get better, but if you report a post on one of my communities, I don’t know if you’re going to get a response in 20 minutes, right? Of course, different expectations for Facebook, but I don’t know. I’m always interested in what expectations the public and general people have for the work that we do, because obviously, most people don’t do this.
14:33 Heather Merrick: Yeah. And I think it’s also important to keep in mind that it’s almost more subjective than you would even think, in terms of what people think really crosses the line. Obviously, murdering people, I think we can all agree, terrible, right? It’s pretty black and white.
14:52 Patrick O’Keefe: Technically, it’s not in my guidelines. Technically, it’s not in my guidelines. I’m just kidding, no.
14:56 Heather Merrick: Yeah. I think there are some commandments about it. It’s not good. But there are also people who feel really strongly about bad words or saying bad things about the religion that they’re a part of, and those are less black and white, and that gets reported all of the time, too, or just in general, someone feeling like someone else was mean to them. You really kind of end up being in the position of sort of police, sort of… I don’t want to say parent, but kind of parent, also. So, it’s a really complicated role, and it’s really not as clear as it may seem from the outside, if you’re not part of this weird world of content moderation.
15:37 Patrick O’Keefe: Yeah. And you probably saw, I would guess, even more of this while you were at Tumblr.
15:42 Heather Merrick: Yes.
15:43 Patrick O’Keefe: A fair number of reports.
15:44 Heather Merrick: But Tumblr has a trust and safety team, and that was a separate team from the team that I was on. So, really, it was them who dealt with that, and they are some tough people over there.
15:55 Patrick O’Keefe: Let’s move off murder. [Laughs] So, tell me about the Airtime superusers group.
15:59 Heather Merrick: Yeah. So, because we’re new, this iteration of the product has been around since last spring. We have a small superusers group, maybe 40 people, and a lot of them are people who have been around since last summer. And over that time, we’ve gotten to know them. We’ve recently started to do group hangout chats in the app with them and actually showcase one of the features of the app, which is that you can watch videos from YouTube together. So, we’ve been watching YouTube videos and kind of following along with them, and trying to make it sort of like a fun little game time with our superusers. And we also have started doing that in a larger room within the app called Airtime HQ, which I almost hesitate to mention, because it contains a live stream of our office, which points almost directly at my desk. So, if you tune in there, you will see me working every day.
16:57 Heather Merrick: So, the superusers we’ve also gotten to know through phone calls. We’ve teamed up with user insights teams to help gather feedback from them, sometimes around a specific theme or feature that we’re really thinking a lot about that month. But in general, sort of trying to get to know them as people, as well. And it’s been really fun, and I feel like those efforts help people feel really connected to the people who work at Airtime as people, and not just this weird entity that made this app.
17:26 Patrick O’Keefe: You have, obviously, a platform for communication at the end of the day, and when you’re working with those superusers, is there an effort to… I don’t know, ask them to go somewhere else to send you information? Like, for example, superuser groups might often have, I don’t know, surveys. Or, you know, it might have some sort of test conversation. You mentioned phone calls, for example. But is it working in the app where you might send them a video survey and have them respond through video and then you have to kind of transcribe that? Or are you trying to push them elsewhere, or is it mostly in the app that you’re kind of getting this feedback?
17:55 Heather Merrick: We have sent them some surveys. Google has a pretty decent survey option. So, we’ve done that, and we’ve also just done email, as well, to the people who we know are in these groups, asking them to, again, let us know their availability, we’ll give them a call, we’ll talk about it. And we’re also lucky enough to have a pretty nice budget, which is not something that all community teams have, but the community team at Airtime is part of the marketing team, and we have some flexibility there in terms of what we do, so we can incentivize people to chat with us if we feel like it’s a really important set of conversations and we want a certain number of people to talk to. That can definitely help if we throw in an Amazon gift card for a 15-minute call with us. So, we’ve done some of that, and I’m trying to think of what these things are called. But basically, it’s like connecting the dots in terms of creating a bigger picture over time.
18:51 Heather Merrick: That’s one of the odd things about community work, is that a lot of this stuff doesn’t happen immediately. It does take a period of months to really figure out the connections between different members of your community. So, for example, in our superusers group, it took us a little bit of time, but after we talked to people, we realized that some of them knew each other previously from Periscope.
19:11 Patrick O’Keefe: Oh, yeah.
19:12 Heather Merrick: And they had actually gone to some sort of Periscope meet-up, some of them. And then there’s a larger group of them that has never met in the real world, and they wanted to find a platform that could help them have this more in-real-life, real-time interaction. So, they came over to Airtime, and they’re, like, hanging out on Airtime, but as people who met through Periscope. And I think that’s really interesting, but that’s one of those things where it’s not immediately obvious. You kind of have to spend time talking to people and gathering information from them, and yeah.
19:46 Patrick O’Keefe: The superuser group, are they interacting with each other? Is there a way to facilitate, on the platform, their talking to one another, or is it mostly with you and with Airtime?
19:56 Heather Merrick: Yeah. They are a little bit. I think when we really started to connect the dots, it was like, hey, these guys feel really comfortable with each other compared to everyone else. It was like, oh, they already kind of knew each other from Periscope. Got it. But yeah, there are some people who started to chat with each other. I think we mostly end up having conversations about updates to the app, and that’s not super exciting. But somehow, some of them get excited about it, and that’s when they pipe up and start chatting with each other, like, “Oh, yeah. I found this new thing. It’s been great. I do this with it. Oh, yeah. I do this, talking to my friend who’s overseas,” or whatever.
20:30 Patrick O’Keefe: What is a superuser on the platform? Is it based upon how much they use it, how many…? I don’t know if there’s some sort of friend mechanism on there, but how many friends they have, or is there some sort of metric you watch?
20:40 Heather Merrick: Yeah. So, it’s pretty fuzzy. There’s no real definition on it at the moment. But basically, if they’ve been using the app for a consistent amount of time, that’s one great indicator, and we can also see data like number of posts, or just hours of video, or something like that. That type of data is really helpful, of course, for seeing, like, you’ve not just been opening the app and staring at nothing; you’ve been interacting with other people. So, yeah. I think over time, that’s the most important part, is you’ve continuously reopened the app. I would say, you know, once a week, it’s good. Multiple times a week, that’s what we like to see.
21:20 Patrick O’Keefe: When you were manager of community management at Tumblr, the company removed replies on their platform.
21:26 Heather Merrick: Yeah.
21:27 Patrick O’Keefe: First of all, I know what replies are, right? I’m not stupid, so I know the word “replies.” But in the context of Tumblr, what were replies, why were they removed, and what happened afterward?
21:36 Heather Merrick: Yeah. So, it was part of a revamp to that feature which, my understanding is it wasn’t a really widely known or widely used feature. But, as the team discovered, after it was removed, a few people felt really passionate about it. And in a way, what you can sometimes see, temporary passions ignite around, “Oh, you changed the font here,” and people would be really upset for, like, a day. This was like, no, people really felt like we had robbed them of something over a period of weeks. And so, I think when that started becoming clear, the community team really kicked into action, in terms of gathering that feedback, logging it, making sure that people’s voices were heard in that way. Of course, we were always responding to people. But really kind of going the extra mile to make sure it was logged, which is important.
22:29 Heather Merrick: I think now, more than several years ago, when I sort of got into this world, community is about data, and data is your friend, and it can really make or break what you want to do, in terms of helping your community. So, arming yourselves with data. Talk to people within the company about, “Hey, we really need to have a strategy for communicating what’s going on here.” Because the end result was going to be great, and I think we knew that, but I don’t think users really knew that. So, eventually, we came to a point, and really, the credit goes to my boss, Elizabeth Tobey, where we got the okay to let people know a little more clearly what was going on, and people responded really well to that. And I think, you know, it’s a lesson in openness, really. It’s always going to be your friend.
23:14 Patrick O’Keefe: So, replies in Tumblr, were those comments on the post itself? What are replies exactly? Because there’s a lot of ways to interact on Tumblr.
23:20 Heather Merrick: Yeah, there are. It’s like a comment, yeah.
23:24 Patrick O’Keefe: So, comments were just gone, kind of? I mean, because there are other ways they could interact, right?
23:29 Heather Merrick: Right. So, there’s reblogging on Tumblr. That’s sort of the main mechanism of Tumblr and what it’s known for, and you can reblog and sort of add to the conversation in that way. And now, those things, replies and reblogging, are all sort of together in one area of the interface, where they weren’t before, and that was the change that was happening at the time.
23:53 Patrick O’Keefe: So, was the plan to always bring that functionality back, or was it really in response to the community?
23:59 Heather Merrick: Now I’m like, my mind is fuzzy. It was so long ago. Different lifetime.
24:04 Patrick O’Keefe: Well, reading the post, reading the announcement post, you get this sense—and that might be spin, in a sense, I don’t know—that it was more or less coming back and it was just down because they were doing some back-end retooling.
24:15 Heather Merrick: Yeah. So, there was sort of this “pardon our dust” post, and that was the first moment where we communicated that this change was happening. And yeah, it was meant to be sort of a reconstruction project. But I think the thing that makes people really uneasy is when they don’t understand timelines. And you can say, “Hey, we’re going to bring this thing back. Sit tight.” And they’re like, “Okay, cool, but for how long? Because this is really impacting the way I interact with the other people that I really connect with on this platform and my fans and friends.” We found out a lot of odd edge cases where people were using it to create stories collaboratively and that kind of thing, which is really neat, but not something that we’d anticipated. And so, those people were really upset, and they needed to know, “Okay. Coming back when?”
25:04 Patrick O’Keefe: Yeah, ambiguity is not usually the friend of community. [Laughs]
25:08 Heather Merrick: No. Yeah, understandably so. I mean, I’m super impatient as a person, and that would drive me crazy, so I think it’s reasonable to expect that it would drive other people crazy, too. Yeah, just being open. But at the same time, in the world of engineering, it’s so hard to nail down an exact date for when something’s going to happen because something unexpected will always arise, so I also completely understand the need to have somewhat ambiguous timelines. I think if you were to say, “Hey, we’re embarking on this gigantic project and we guarantee it will be live by July 3rd,” or whatever, you’re going to let people down, and so I don’t really think that that’s the answer, either. There’s somewhere in between, where you can sort of be like, “Hey, this is going to take us a few months and we promise we’ll give you updates along the way, so stay tuned. Two weeks from today, we’ll let you know.” That kind of thing. It could be considered overcommunicating.
26:05 Heather Merrick: I’m sure there are plenty of people who do not feel the sort of impatience and anxiety that people like me feel, and who would not find that helpful. They would probably just find it annoying. But I think serving, even if it’s a minority of people who really want that information, it can’t really hurt, you know?
26:21 Patrick O’Keefe: No, not at all. I’m a big believer in over-communication, to your point, in my personal relationships. [Laughs]
26:27 Heather Merrick: Yeah!
26:28 Patrick O’Keefe: So, I mean, I want all the people who rely on me, care about me, love me, whatever, to know how I feel at that particular moment about something or that impacts them, whether that be travel plans or something they did or something I did, or a way they may feel about something I did. Like, let’s just talk and you’ll know how I feel, I’ll know how you feel, and it’s always better than… “Always” is strong, but it’s usually better than ambiguity. But, yeah, over-communication – I’m with you.
26:54 Heather Merrick: Yeah.
26:55 Patrick O’Keefe: Before the show, you told me that you think a lot about security, specifically in the context of website hacking and how people feel about it, an approach that… Obviously, we have a lot of internet companies, web services, I don’t know, credit card machines at the local retail store getting hacked all the time. Things are getting hacked. From a community perspective, I think that there’s interesting things about this. One of the things I think about first—and feel free to take it in a different direction—is that I feel like hacking used to be worse, being hacked used to be a lot worse than it is now, and that doesn’t sound great, but it’s almost like… I don’t want to say people are conditioned. I think they’re more aware, as I think you feel as well, but I think that hacking is not the complete, unbelievable, scary, unknown, “what has happened to my life?” moment that it might have been a decade ago, where now it seems like if you get hacked and you put in a reasonable effort, the main thing you’re judged on is—again, I guess to our last discussion—how well or poorly you communicate.
27:56 Heather Merrick: Yeah. I mean, I think the fact that you’re essentially conditioned to feel like hacking is okay and a normal thing, that’s not good.
28:05 Patrick O’Keefe: [Laughs] No!
28:06 Heather Merrick: This is not a good moment in history for us people who use our email and password to log in to every single thing in our lives. It’s not great, and I don’t know if you’ve been up on the Unroll.me news, but that cropped up, I think, since we chatted. Unroll.me was a service that had access to your Gmail inbox and they were selling anonymized data to companies like Uber for various reasons. I had that service on my old email address and I really don’t remember there being a popup telling me, like, “Hey, this free service is free, and that’s great, but also, we’re going to sell your soul to whoever we feel like selling it to.” That’s not cool. So, this is a really common issue in our time and I think it’s easy to get desensitized to it because it’s happening so much.
29:03 Heather Merrick: I was checking this morning to make sure I had the numbers right on the Yahoo! hacking situation from I think it was late last year, and it was billion, one billion. I was like, oh, yeah, it is billion with a “b.” That’s bad. So, yeah, I think in terms of community… Again, in a way, it connects to the over-communication thing. There is no way that it’s reasonable to expect someone to actually read your terms of service when they’re signing up, especially if it’s not in any sort of human-readable language. I think there is a definite responsibility that companies have to make it clear to you, okay, yes, this is a free service. Don’t just leave it to my imagination how you’re making your money off my data; I want to know. Something during signup that says, “This is what’s going to happen to your data and this is exactly why we’re asking for access to your inbox in this way.”
30:00 Heather Merrick: So, people who feel cool with that can say, “Okay, yeah.” And maybe people who don’t feel so great about that can say, “No, I’d rather pay you ten dollars a month instead,” or something like that. But people aren’t really offering that option, and I think subscription services are great and companies could really benefit from considering them as an alternative to practices like data selling.
30:24 Patrick O’Keefe: You know, we’re verging into ethics, right?
30:26 Heather Merrick: Yeah.
30:26 Patrick O’Keefe: When you talk about Unroll.me, you’re verging into ethics, and it reminds me of another story that just came out yesterday. The New York Times did a story, “Uber’s CEO Plays with Fire,” and it was about how Uber was tracking people after they had uninstalled the app, and it kind of centers around a meeting with Tim Cook, the Apple CEO, who basically told him to knock it off or they were going to take the app out of the App Store, which would substantially harm Uber’s business. Just casting aside all other issues Uber has had in the last, I don’t know, six months, a year, the never-ending news cycle of Uber, what’s scary to me about this is that these cases are high profile, there is consequence, but, like in his case, he was able to build a successful business using those tactics, right, and he’s at a point now where Uber is probably going to continue to be successful regardless. Maybe they’ll fail. Maybe they won’t. I don’t think they’re going to die tomorrow, let’s say.
31:17 Patrick O’Keefe: The thing about unethical actions—what I tell people—is it’s just not worth getting caught, because if you get caught the consequences are strong. But I know people in the community space, I’ve known people—I’ve cut ties with them—who do unethical things, and for every person who gets caught, there’s all these people who are getting away with these kind of tactics until they make it, so to speak…
31:35 Heather Merrick: Mm-hmm.
31:36 Patrick O’Keefe: …if they even switch it off when they make it. I mean, Unroll.me’s been around for a long time. Uber is very successful. That’s what’s difficult to me, is that… Like, I hear about local businesses in the area I live, which is Kitty Hawk, North Carolina, who might be skirting wage or labor laws by paying their employees in a certain way, right? I hear about things like that.
31:53 Heather Merrick: Yes. Right.
31:54 Patrick O’Keefe: The consequence for that… There is a consequence, but the reality is that there are so many people getting away with it. So, I don’t know that I have a question there or a point. It’s just very frustrating because I, within the community space at least, always push kind of this hyper-ethical agenda because I feel like the consequence of getting caught is so great that a lot of people will not survive. Yes, some of these people survive because they are so big already, but if you get caught as someone who is smaller—i.e. most of us—then your consequence could very well be death.
32:25 Heather Merrick: Right. I think, to your example of small businesses, there is actually a big story in The New York Times a while back about the labor laws around nail salons in the city, and it was a big exposé on various things that those businesses were doing to get around labor laws, and pretty terrible stuff going down there, and the result of that story and that exposé was that there were actual legal reforms made and a lot of those businesses were checked into, potentially shut down. I don’t know any numbers around that, but there was an actual civic reaction to that story, and I think that it’s good when a company like Uber or Unroll.me has a story coming out around it. They’re the big guys and that’s why it’s a story, but there are definitely lots of other people who are doing this and I think they could also be learning a lesson from seeing the result of the exposure of these practices. The fact that there’s a lot of businesses doing the same stuff doesn’t mean it’s cool, by any means. [Laughs]
33:32 Patrick O’Keefe: Yeah, I think that’s a great point. I think when you have those big stories, it dissuades some people. You know, it may not dissuade everyone. It may not dissuade a lot of people, but it does definitely dissuade some people, which, you know, to draw a parallel to community is why I think it is important that… I don’t believe in the line of community professionals. [Laughs] Like, I’m not here standing to defend community professionals at every end. I also think we have a responsibility to users to defend users as well, or members or whatever, the community, and when someone does something wrong in the community space, I have called people out and criticized people, and I think if we can’t criticize our own industry, and hopefully in doing so dissuade other people from doing the same thing, then I think we’ve got a problem. I think we’re always inviting outside scrutiny if we cannot look at ourselves and say, “Okay, well, honestly, we’re not all great. We’re not all… I’m not in some sort of kumbaya community land,” right? I love this space. I love this industry. I’ve been in it for a long time.
34:26 Patrick O’Keefe: But if you see a community professional doing something wrong, I don’t know if you need to tell them or do anything, but make a note, remember, and if something comes up where you have the opportunity to speak out against those sort of activities and you have a voice, I always encourage people to take advantage of their voice. And the downside is people are always like, “Well, it’s not my responsibility. It could cost me this opportunity or that job,” and I respect that. But as much as possible, I think that there has to be some sort of self-accountability in this industry.
34:53 Heather Merrick: Yeah, I think that goes for sort of the tech industry in general. I have a feeling… I don’t have any specific examples to support this necessarily, but I have a feeling that ethics and technology is becoming a topic that is actually resonating with more people and something that will start to impact the way that people interact with the organizations that they’re working for, and potentially that will be more of a factor in the companies that they choose to work for. I feel like I’ve seen more articles around this topic in general, ethics and tech, and I do think that you do need to interrogate the practices of your own industry and the people around you. It’s like, if you didn’t care, you wouldn’t say anything at all, but I care about this space, I care about technology, I care about community, so I say something. I feel like it’s an expression of love for me to be critical, even if it doesn’t get interpreted that way hardly ever. That’s how I like to think of it, for sure.
35:59 Patrick O’Keefe: I am 1000% there with you. Heather, thank you so much for coming on the show and for sharing your experience with us.
36:06 Heather Merrick: Thank you! It was fun.
36:08 Patrick O’Keefe: We’ve been talking with Heather Merrick, community experience manager at Airtime. She runs a silly newsletter—her words, not mine—called Heather. You can find a link to it on her website, at heathermerrick.com. She’s also heatheremerrick on Twitter. Heather, letter E, M-E-R-R-I-C-K. For the transcript from this episode, plus highlights and links that we mentioned, please visit communitysignal.com. Community Signal is produced by Karn Broad, and we’ll see you next time.
Thank you for listening to Community Signal.