Elizabeth Koenig has a degree in cultural anthropology. She’s also an account manager at The Social Element (formerly Emoderation), where she manages teams of moderators and community engagement specialists that scale based upon client needs. We talk about how cultural anthropology applies to online communities. Plus:
- What happens when companies rely on automated moderation too much
- How to motivate community pros to invest in client communities when they don’t choose the clients
- Why The Social Element, a company powered by a remote workforce, has a strong workplace community
“Data science and anthropology had this baby, and it’s called sentiment analysis or emotional artificial intelligence. Instead of hiring an anthropologist to read through an entire community’s worth of data or every post or reaction to a new story and then writing a report on it, we have the power now with data science to use sentiment analysis to get an emotional report on how people react to certain stories within certain time frames. That technology is really interesting, because it works the way a filter works but with some artificial intelligence involved, because it attributes words to emotions, and then it keeps on learning from there. For example, if somebody says something like, ‘This makes me happy,’ the sentiment analysis would see the word happy and attribute that to a positive emotional reaction. That allows for brands or agencies like The Social Element to analyze huge amounts of data and to get a snapshot of the emotional reaction that people may have, which is like what anthropological field work is.
“But the thing about it is that it’s 70% accurate, which is pretty good; but you still need someone to take that and make it meaningful for the group that is interested in that data, whoever they may be. The combination of using the sentiment analysis stuff that’s coming out now, with someone who understands some basic social science procedures, can create a really powerful snapshot, a look at how community is feeling hour by hour, even. It could be real-time; it could be over the period of an entire election cycle.” -@ElizKoenig
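The word-to-emotion mapping Elizabeth describes can be sketched in a few lines. This is a hypothetical, minimal lexicon-based scorer for illustration only — the lexicon and function names are invented, not The Social Element’s actual tooling — but it shows the basic mechanism: attribute words to emotions, then aggregate.

```python
# Hypothetical minimal lexicon-based sentiment scorer.
# Maps individual words to polarity the way Elizabeth describes
# ("happy" -> positive), then averages over a post. Real tools layer
# machine learning on top and still hover around 70% accuracy.
LEXICON = {
    "happy": 1.0, "love": 1.0, "great": 0.8,
    "sad": -1.0, "hate": -1.0, "awful": -0.8,
}

def score(post: str) -> float:
    """Average polarity of known words; 0.0 if no known words appear."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    hits = [LEXICON[w] for w in words if w in LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

def label(post: str) -> str:
    """Collapse the score into a positive/negative/neutral label."""
    s = score(post)
    return "positive" if s > 0 else "negative" if s < 0 else "neutral"
```

Note the failure mode Elizabeth raises: a sarcastic post like “Oh great, it broke again” contains the word “great” and would be labeled positive by this approach, which is exactly why a human still has to keep the process in check.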
“My favorite example [of what can go wrong if you just apply data without analysis] is C-3PO. I’m a huge Star Wars nerd. Of course, C-3PO is a droid [and] his whole primary purpose is to interpret languages. He’s just a language robot, but he has trouble with human emotion. That is the exact issue that we run into with any kind of artificial intelligence right now. I think that the sentiment analysis really struggles with sarcasm. It’s not going to get it right all the time. You have to have somebody on the ground keeping the process in check, doing quality assurance using real-time examples from the actual community. You could even interview people in a community, potentially, to get perspective on what’s really going on. I think it’s important to keep things human. These are tools that are here to assist us. They’re very powerful.” -@ElizKoenig
“People who rely 100% on automated moderation are, right now and, I feel, in the foreseeable future, doomed to mediocre performance when it comes to moderation. I think they’re doomed to mediocrity, which means mediocre communities, mediocre results from communities, and this role, this task, this discipline, being viewed as underperforming, because they’re putting in the effort that leads to mediocrity. It’s almost a self-fulfilling prophecy with people that go all in 100% on automation.” -@patrickokeefe
“A lot of people view moderation jobs as pretty low on the totem pole, when the reality is that that’s what makes everything work.” -@ElizKoenig
About Elizabeth Koenig
Elizabeth Koenig was born and raised in Columbia, South Carolina and decided she loved the internet through AOL Beanie Baby forums and making her own websites documenting email chain letters. Elizabeth went to college and got a Bachelor of Arts degree in cultural anthropology and wrote her thesis about college student use of Facebook and how it was affecting the culture of being in college (this was when Facebook was new).
Since then, Elizabeth worked in recycling and waste management (by choice!) but ended up working for Mind Candy/Moshi Monsters on the side, which led her to Emoderation, which recently rebranded as The Social Element, where she has found a home and a career for her passion for digital culture as a moderator, project manager and now account manager.
Elizabeth has a hilarious dog, who is usually under her desk. Her long-time boyfriend is a blacksmith (he makes kitchen knives) and they love to go hiking and traveling.
- Wikipedia page for The Twilight Zone, a TV series created by Rod Serling
- Elizabeth on LinkedIn
- Moshi Monsters (from Mind Candy), where Elizabeth was first hired to work in online communities
- The Social Element, formerly Emoderation, where Elizabeth is an account manager
- Oxford’s definition of anthropology
- Urban Dictionary, a crowdsourced dictionary of slang words and phrases
- Community Signal episode with Jessamyn West of MetaFilter
- Nextdoor, a private social network for those in your local neighborhood
- Second Life, a virtual world
- Wikipedia page for C-3PO, a robot from Star Wars who interprets languages
- “Facebook Will Add 3,000 Moderators After Video Killings” by Colin Lecher for The Verge
- Amazon Mechanical Turk, a “marketplace for work” which some have used for moderation-related tasks
- musical.ly, a video community known for performance videos tied to music
- Community Signal episode with Brian Pontarelli of Inversoft
- Jigsaw, an incubator within Google-parent Alphabet, that “builds technology to tackle some of the toughest global security challenges facing the world today,” some of which are tied to online communities and moderation, including their collaboration with The New York Times
- The Social Element’s Sue John (project manager), Tamara Littleton (CEO), Kate Hartley (COO of subsidiary Polpeo) and Ashley Cooksley (chief sales officer), all of whom Patrick has connected with in person in different locations
00:04 You’re listening to Community Signal, the podcast for online community professionals. Tweet as you listen using #communitysignal. Here’s your host, Patrick O’Keefe.
00:20 Patrick O’Keefe: Hello. You just crossed over into The Twilight Zone… I mean Community Signal, and I am your host, Patrick O’Keefe. We’re talking with Elizabeth Koenig and taking a cultural anthropology approach to community, plus the impact that automated moderation is having on the community career path, and building a strong workplace community for remote teams. Elizabeth was born and raised in Columbia, South Carolina, my neighbor to the south, and decided she loved the internet through AOL Beanie Baby forums and making her own website documenting email chain letters. Elizabeth went to college and got a Bachelor of Arts degree in cultural anthropology, and wrote her thesis about college student use of Facebook and how it was affecting the culture of being in college. This was when Facebook was new.
01:01 Patrick O’Keefe: Since then, Elizabeth worked in recycling and waste management by choice but ended up working for Mind Candy/Moshi Monsters on the side, which led her to Emoderation—which recently rebranded as The Social Element—where she has found a home and a career for her passion in digital culture as a moderator, project manager, and now account manager. In that role, she is responsible for the agency’s relationship with the clients assigned to her, including the management of multiple teams of moderators and community engagement specialists that scale based upon the needs of the client. Elizabeth has a hilarious dog who’s usually under her desk at all times. Her long-time boyfriend is a blacksmith. He makes kitchen knives. They love to go hiking and traveling. Elizabeth, welcome to the show.
01:39 Elizabeth Koenig: Thanks. I’m so happy to be here.
01:41 Patrick O’Keefe: It’s a pleasure to have you. Gosh, I know so many people over at The Social Element, formerly Emoderation, our first sponsor here on the podcast. So many people I like and respect, and so, I’m happy to have yet another one join us here on Community Signal.
01:55 Elizabeth Koenig: Yeah. I like and respect a lot of those people, too.
01:58 Patrick O’Keefe: That’s good, because even if you didn’t like them, this would not be the time to say it. Right?
02:03 Elizabeth Koenig: Yeah. Right.
02:04 Patrick O’Keefe: So, you have a degree in cultural anthropology. Anthropology is the study of human societies and cultures, and their development, per Oxford. I can’t take credit for that. Let’s talk about the application of cultural anthropology to online community. Where do we begin?
02:21 Elizabeth Koenig: It’s a really big place to start, but I think I can narrow it down pretty quickly.
02:25 Patrick O’Keefe: Great.
02:25 Elizabeth Koenig: Most of the people who listen to the show definitely work in community or are, hopefully, a part of some sort of community. Cultural anthropology is just the study of different communities. And then within online communities, it’s studied the exact same way. Online communities have social norms and collective memories just like offline communities do. I think one of my favorite things to talk about with online communities that has a crossover with offline communities is that they form their own languages, which I’m sure you’ve seen. As a moderator, and as anyone who works online knows, occasionally you have to Google a meme or some sort of term. I end up on Urban Dictionary sometimes, because I have no idea what it means. That’s because an online community has invented a new meaning for something that I’m just not privy to yet. It’s just a great example of online communities at work and how linguistics is an anthropological part of studying online communities.
03:28 Patrick O’Keefe: It is. I had Jessamyn West from MetaFilter on the show somewhat recently, and as I recall, I think she talked about how they have invented different terminology within the community. It’s pretty common. When you think about different great cultures, great civilizations of the past, of history, they were built in person. People were seeing each other; they were doing these things. They may not even have had a great means of communication other than putting something on paper, or parchment, or writing it down. We have cultures that we’re building that are strictly digital, where most of the people, I would hazard a guess, most people in most online communities will never meet one another, will never meet someone from that community; not to say many don’t; I have. I’ve met people from online communities. They’re some of my best friends. But communities are large; there’s a lot of people in them. They know the people through the computer. Do you think that makes the culture that’s being developed less… I don’t know… historically significant or less…
04:28 Patrick O’Keefe: I don’t want to say important, but there’s a difference there…but is it just the changing of times? Is this where we’re at now, that this is how society’s cultures, uprisings, political movements, this is how it’s done; and from this point forward, we’ll either improve upon this or be this, but we won’t go back? It won’t be like it was when we talk about great civilizations of the past? Is it fair to think of great online communities in the context of great… I don’t know… maybe too much grandeur here, great civilizations?
04:57 Elizabeth Koenig: It’s all a matter of perspective. It’s hard to quantify that this community is the greatest community ever. We can’t do that historically, although we may have our preferences. I think the same thing about online communities. There are online communities that have saved people’s lives, and to those people, that online community is going to be the greatest community that ever existed. I think that we definitely can’t go backwards from here. I think that the way online community forms and changes is much faster than traditional offline communities are able to evolve. If you look at Twitter cliques, their jokes and the things that go on with those groups, it changes by the hour. So, I don’t think it makes anything maybe less historically significant but collectively, it’s a new way of doing culture for everybody.
05:51 Patrick O’Keefe: Right. And we see this push for people who live next door to each other. There’s a nextdoor.com, the social network for neighborhoods. People want to go online. It might just show you who your racist neighbor is, or it might show you that person is wonderful; but one way or another, even though they could just knock on next door and say, “Oh, yeah. This happened,” there is still this push to go online and to communicate with one another online even though you’re in close proximity.
06:14 Elizabeth Koenig: Yeah. That definitely could be viewed as the downside. There are upsides to online community, because you can connect with people all over the world about topics that you may not have realized other people have interest in. But the Nextdoor app is a great example of an isolationist perspective of what can happen if you don’t actually see people face-to-face, or have a backyard barbecue, or whatever.
06:36 Patrick O’Keefe: We can’t just have an avatar barbecue with our avatars and fake beef?
06:43 Elizabeth Koenig: It’s sort of like Second Life kind of stuff.
06:45 Patrick O’Keefe: Exactly.
06:47 Elizabeth Koenig: Some anthropologists look at that as a reaction to a society focused on individuals rather than on the whole.
06:55 Patrick O’Keefe: So, when you were studying this in college, where did you want to take it, or was it just something that you were fascinated by?
07:03 Elizabeth Koenig: I had an amazing professor who changed my perspective on the world, basically. I just had one of those eye-opening moments of, wow, I’d been thinking about everything one way, and now I could see things in a better way. I didn’t really know what I was going to do with it, but I’ve always been a total nerd on the internet. I really enjoy internet culture. What inspired me to write about Facebook was I noticed that in college we were sitting around on Facebook in groups. We would all be on laptops in one room together on Facebook, and it was like, “Why aren’t we just hanging out?” Because Facebook was so interesting, we could interact with so many more people all at once, and it’s very addicting, and all these other elements. I was interested in that. I had no idea where it would take me. I ended up in this industry… First it was a side hustle, and then it evolved at The Social Element into this career where I get to have an insider perspective on some online communities.
08:03 Elizabeth Koenig: Every day when I come to work, I know what communities that I’m working with, or what brands and the communities around those brands, what’s going on with them every day, and what I need to watch out for. So, it has evolved in a really interesting way. I’m really happy about it.
08:19 Patrick O’Keefe: You told me before the show your thesis and a friend in the industry is what got you into this door of working in online community. Was there a moment working when it was a side hustle, then it became a career, where it clicked, where it was like, “Well, I’ve kind of been on this path for a while. The things I studied are relevant to the work that I do”? Was there a moment for you, or is it just you just got there?
08:39 Elizabeth Koenig: In my very first interview for the job with Mind Candy, they wanted to see what I wrote about Facebook. So, it was part of my application process, but it really made sense to me. I’d been sitting on the internet researching things and just reading since 1995 or ’96, so it does feel really natural for me to continue being interested and engaged in what is going on day-to-day, what are people talking about, where are they coming from, what’s going on on the other side of the world, even. It’s definitely a non-traditional path for someone who studied cultural anthropology.
09:20 Patrick O’Keefe: You told me before the show that looking at community through that same lens, “Online communities are a manifestation of culture and create their own cultures. So, how we research that and the data coming out of it is important. Now that agencies and AI are getting involved and those metrics are valuable, how can we ensure that communities are being researched in the right way and that we’re getting the perspective right?” How do we do that?
09:43 Elizabeth Koenig: There are a couple of ways, but I do think it starts with the emergence of emotional data, which is what social listening is becoming. Basically, data science and anthropology had this baby, and it’s called sentiment analysis or emotional artificial intelligence. Instead of hiring an anthropologist to read through an entire community’s worth of data, or every post or reaction to a new story and then writing a report on it, we have the power now with data science to use sentiment analysis to get an emotional report on how people react to certain stories within certain time frames. That technology is really interesting, because it works the way a filter works but with some artificial intelligence involved, because it attributes words to emotions, and then it keeps on learning from there.
10:39 Elizabeth Koenig: So, for example, if somebody says something like, “This makes me happy,” the sentiment analysis would see the word happy and attribute that to a positive emotional reaction. That allows for brands or agencies like The Social Element to analyze huge amounts of data and to get a snapshot of the emotional reaction that people may have, which is like what anthropological field work is. But the thing about it is that it’s 70% accurate, which is pretty good; but you still need someone to take that and make it meaningful for the group that is interested in that data, whoever they may be. The combination of using the sentiment analysis stuff that’s coming out now with someone who understands some basic social science procedures can create a really powerful snapshot, a look at how community is feeling hour by hour, even.
11:41 Elizabeth Koenig: It could be real-time; it could be over the period of an entire election cycle. I saw… it was like a slide on the deck from a conference. It was a data science conference in Boston a little while ago. It analyzed the emoji usage between the dates of a feud between Taylor Swift and Kanye West. It was analyzing the emojis for each celebrity based on this controversy and how many snakes Taylor Swift got over this time period vs. how many hearts that Kanye West got. So, you can apply those to so many things. It’s really fascinating.
12:22 Patrick O’Keefe: What do you think is the downside, in that you said “the right way,” and you said someone that can help sort the data out? What’s the downside of just applying the data? What goes wrong?
12:31 Elizabeth Koenig: I think my favorite example is C-3PO. I’m a huge Star Wars nerd. Of course, C-3PO is a droid, a robot, whose whole purpose is to interpret languages. He’s just a language robot, but he has trouble with human emotion. I think that that is the exact issue that we run into with any kind of artificial intelligence right now. I think that the sentiment analysis really struggles with sarcasm. It’s not going to get it right all the time. You have to have somebody on the ground keeping the process in check, doing quality assurance using real-time examples from the actual community. You could even interview people in a community, potentially, to get perspective on what’s really going on. I think it’s important to keep things human. These are tools that are here to assist us. They’re very powerful, and they’re amazing, but they need to be used in a context of just usable data.
13:33 Patrick O’Keefe: So, the downside is really getting it wrong?
13:35 Elizabeth Koenig: Basically, yeah.
13:36 Patrick O’Keefe: That’s the downside, getting it wrong. Because 70% is good but it’s not great. I mean, 70%, there’s a lot wrong there. There’s a lot of things that can go wrong with the other 30 that can swing the overall emotion or the sentiment. That 30% can push you down below a majority, or it could push you up into a majority. You mentioned automated moderation. Another thing you told me was that, “The basic moderation job is quickly being phased out, and the work is now being done with AI and filters. The industry around moderation is changing so fast. I’m extremely lucky I was able to get my foot in the door and build a career related to it before the door closed.” Do you really think that the door is closed, that it is now much harder or near impossible for someone to break into this industry as you did through a part-time moderator role?
14:22 Elizabeth Koenig: I think it’s much more difficult than it was. You know what’s really interesting about that 70% statistic is that even human moderators get it right about 80% of the time, because we’re always going to argue about what something means, even among people. In terms of getting a moderation job, I think it was hard when I broke into it. I think it’s extremely difficult now. There are a lot of companies that are offshoring moderation work, and there are a lot of companies using artificial intelligence, which, for really sensitive communities, I think is very dangerous. I think that we definitely still have markets for marginalized communities, children’s communities, definitely; but in general, human moderation is…
15:08 Patrick O’Keefe: In some ways, it sounds like the door for reasonable-paying jobs is closed or closing. When Facebook says they’re going to hire 3,000 moderators, I think we all look around at each other, and the presumption is that those people won’t be well paid, but that the work will go to Mechanical Turk-style moderation and/or people in countries where the cost is much lower.
15:29 Elizabeth Koenig: Yeah. Then you have to question: Are they being paid a living wage? Do they have resources available to them if they’re dealing with really sensitive content?
15:37 Patrick O’Keefe: Self-care; are they being properly trained? Are they just bodies in a chair with a yes/no button and a general sense of what the community is, but really no understanding of the nuance of this role?
15:48 Elizabeth Koenig: Yeah. It’s not to say that people everywhere aren’t capable of the work; it’s just making sure that people are just paid… People need to be taken care of. It needs to be done the right way.
16:00 Patrick O’Keefe: Yeah. I’d be interested to learn more about those 3,000 moderators. I think they said it was joining another 2,500 or 3,000 already, so that’s 5,000 to 6,000. We could be talking about the largest team of online moderators in the history of this work. It will be interesting to see how that goes, how we’ll look back at it 10, 20 years from now. There are already articles about the dark side of moderation, the underbelly of the internet, the people who toil away in a dark room, looking at the things that no one wants to see. I wonder if, 10 or 20 years from now, this will be the scandal of underpaid workers, poorly paid workers, unfair wages; if online community moderation is the next job that will be looked at in that light. I hope not, but I also would not be shocked.
16:49 Elizabeth Koenig: Yeah. It’s a lot of content.
16:52 Patrick O’Keefe: No. It is a lot of content.
16:53 Elizabeth Koenig: It’s just so much content and so much bad content. But yeah, agreed, it will be interesting to see how it plays out.
17:00 Patrick O’Keefe: Sticking with AI moderation, you told me that… and you kind of got into this but… “What haunts you right now is the very quick changeover from human moderation to automated and AI-based moderation, not only from a business perspective,” which we talked about. “It is tough to see the entire industry be swallowed up 1984-style…” It’s a rosy picture… “By machines. But if the technology isn’t good enough yet to be able to catch cries for help, or potential suicide, or self-harm content, especially on more at-risk communities involving children and teenagers, or particularly vulnerable groups that don’t self-moderate.” As someone, speaking for myself, who loves filtering tech, loves AI and machine learning, and what they can do, this is a concern I share, because I recognize what they cannot do. The overreliance on it is, as you’ve alluded to, troubling, as is the ability of these companies to simply cut people and staff as a line item for these issues. It’s just too tempting for some to just erase that budget and throw it behind some sort of machine learning or filtering tech. The counterargument is then, well, this is better than nothing at all. We can’t afford moderators, they say.
18:01 Patrick O’Keefe: But people who rely 100% on automated moderation are, right now and, I feel, in the foreseeable future, doomed to mediocre performance when it comes to moderation. I think they’re doomed to mediocrity, which means mediocre communities, mediocre results from communities, and this role, this task, this discipline, being viewed as underperforming, because they’re putting in the effort that leads to mediocrity. It’s almost a self-fulfilling prophecy with people that go all in 100% on automation.
18:33 Elizabeth Koenig: Yeah. I understand the cost perspective on that. I get that. But people in communities and even… It’s not even communities; it’s in general. They know when there’s automated moderation. So, I think that when that happens, if somebody wants to get around the filter… maybe when artificial intelligence gets better, this won’t happen as much… but if they know there’s a filter, if they know that that’s automated, they will try to get around it, and they usually can. So, it defeats the whole idea. Like you were saying, I think the biggest one is really communities for children, communities where there are underage people participating. There’s no room for those kinds of errors. You could really end up with a huge liability. Those kids deserve more. They deserve to be able to interact in a safe place.
19:25 Patrick O’Keefe: Yes. 70%, 80% in that scenario is not good. It’s just not going to work. I know it scales differently when we’re talking about Facebook and the amount of content they have, the amount of content some of your clients certainly have vs. the amount of content I have or a community that… I don’t know… gets 500, 1,000, 10,000 contributions a month, 25,000 contributions a month for smaller communities and for communities that are not massive, which is most. Most brand communities, most organizations, most companies don’t have a community that’s just pouring in thousands of posts every second, thousands of posts every day. They get a more manageable amount they can deal with. In those cases, even 80% is not great. Because if you think about it, that’s 2 out of every 10, 20 out of every 100, 200 out of every 1,000 that are not handled right. That adds up in turning people away. When you go strictly on automation, you’re pushing the percentage down. You’re not pushing it up; you’re pushing it down.
20:24 Patrick O’Keefe: There’s a cost to that for businesses who want customers. So, that seems like the most awkward sentence ever. There’s a cost to that. We all want customers; we all want people. Because to us it’s just… On a scale of Facebook, 10 million, 20 million, maybe they don’t care so much. We have to use Facebook. They’re so dominant. They’re this preferred communications tool. Facebook Messenger is creeping into our phones. There’s this feeling we have to use it. But most of us, no one has to use us, for the most part, the rest of us. They have other options. They can go somewhere else, and they will. So, to not invest in this and say, okay, well, let’s say we’re going to cut moderators, and maybe we mess up on, let’s say, 10%. Someone gets a post removed, or a contribution, a photo, any kind of UGC—a video, a music video, like on the musical.ly app, where people sing along—one thing gets removed improperly, and that person may never be back again.
21:14 Patrick O’Keefe: They may just get angry, go away, you lose them forever. There’s an acceptable margin of error. But those people that you give the error to, that you hand the error sandwich to, they’re going to take a bite, and they might just leave. You might never see them again. So, I don’t know… the propensity to do things that push the percentages down is bad for that reason. We should always be striving to push the percentages upward.
21:36 Elizabeth Koenig: Yeah. I think that there are some uses of automation that are great. Basically using it to flag content for human moderation is an incredible use of automation. When you’re dealing with a huge amount of content, that is an incredibly useful thing. It’s a powerful way to handle a lot of content. But, yeah, that human element, it’s just not replaceable, at least right now.
22:00 Patrick O’Keefe: Oh, yeah. I don’t think anyone should misconstrue this conversation, your part or mine, as some sort of indictment of automated moderation, or filtering tech, or anything like that. I love it. I get this sense you really like the potential of it and the things that it does well. We’ve had the CEO of Inversoft, Brian Pontarelli, on this show. They have a filtering solution that’s pretty accessible to people budget-wise. So, I’m a big fan of the stuff they’re doing and that other companies are doing, like Jigsaw, which is part of Google, and the work they’re doing with The New York Times. There’s a lot of great potential out there for these things. I think what’s being criticized here today in our conversation is just the overreliance on it. We’ve talked about this on this show before, but it’s been a while: the idea that these tools make the job of community professionals better, or help them to be more efficient for the organization. That’s just it; that’s the whole thing.
22:51 Patrick O’Keefe: So, if you’re lowering your success rate by cutting people, which you will be, that’s not a good look; but if you are taking over menial tasks, repetitive tasks, things that a human doesn’t necessarily need to look at, then that’s excellent; then you’re raising the efficiency level, you’re getting more efficient budget-wise. The key thing is that you are making it a better experience for the community users, for the people who are actually using the community. You’re giving them quicker decisions; you’re giving them accurate decisions; and that’s the key. So, we’re not being critical of the technology wholly here; it’s the individual use, the overuse of it, that I think is the problem.
23:22 Elizabeth Koenig: Yeah. Using it as a one-stop solution does not work…
23:26 Patrick O’Keefe: It’s not a crutch.
23:28 Elizabeth Koenig: …but as a powerful tool to help, absolutely.
23:31 Patrick O’Keefe: You manage teams of moderators and community engagement specialists that scale, and they work with multiple brands. You mentioned to me how people working at the brand, people working in-house at that company, they can have more of a flag-carrying mentality by default, whereas, when you work for an agency, and you work with all these different clients… and you don’t get to choose those clients necessarily; they’re handed to you… it can be challenging to get these teams to invest in the communities.
23:55 Elizabeth Koenig: Yes.
23:56 Patrick O’Keefe: How do you get those people who jump from client to client to invest?
24:01 Elizabeth Koenig: It’s tough. I come from a moderation background, and I jumped from client to client for years. It was a Christian dating website, to popsicles, to who knows, all kinds of things. I think that instead of putting the onus on the moderator to be a flag carrier for that specific brand or company, you have to change gears a little bit and put that onus on being a flag carrier for your own agency. The Social Element, I will toot my own horn about them forever, working for them. You have to cultivate your own community culture at work so that your moderators have a sense of pride to represent you. I think that that is probably one of the biggest ways to get a moderation team that’s working for multiple clients to really be proud of their own work.
24:55 Elizabeth Koenig: It’s the same thing with really giving positive feedback and reinforcing that individual person’s own work. Just being proud of your own work and being excited about doing a good job personally is the other part to it. Some brands and some clients they may not ever be invested in, because it’s just not them; but they will always be invested in themselves, hopefully, and doing the best job that they can do; and then also for the agencies that they work for, being proud and doing a good job for the agency. So, I think that’s the best way to motivate people in that arena.
25:33 Patrick O’Keefe: So, encouraging people and rewarding success for the agency and for the individual; the idea that we may move around among a lot of clients, but the level of success you are having overall at all these clients represents success for the agency, which is success for all of us; it’s success for you. It reminds me of something I said when I spoke at CNN years ago. There were different people there, different groups: on-air talent and behind-the-scenes people. My theme to them was: let’s say you don’t want to interact online, or you don’t like doing that with viewers. Okay, let’s just say that’s true. It’s a CNN-branded thing; it’s CNN stuff; why am I doing this? So, here’s the answer: get selfish. It’s you. You’re building a community around you right now, around the work you do, that you will take wherever you go. When you leave CNN and you go somewhere else, the people who like you because they interacted with you will stick with you. They won’t just go to the next person in line, the next anchor, the next journalist, the next writer; they love you because you took the effort.
26:32 Patrick O’Keefe: So, you bring them with you wherever you go. I think the same is true in these sorts of situations, where it’s important to give that person, as you suggested, something to hang their hat on: that they are achieving these things, and these things are their successes regardless of anything else, regardless of the client they worked for, regardless of the time frame, how long they were there, regardless of whether they leave one day. Taking that pride in their work is something that they can hang their hat on and have forever. For as long as they’re in this industry, for as long as they are a professional, they’ll always have that success, that great work, that moment in time, that recognition. So, that makes a lot of sense.
27:09 Elizabeth Koenig: Yeah. Agreed.
27:10 Patrick O’Keefe: The Social Element is a remote agency. I’ve had dinner with project manager Sue John in Orlando; had a drink with CEO Tamara Littleton and subsidiary company Polpeo COO Kate Hartley in Austin when they were down for South by Southwest; I had lunch with chief sales officer Ashley Cooksley in Wilmington on my way down for a wedding. You’re spread around. You raved about the workplace community that you have at The Social Element to me privately, and you just hinted at it, also, on the show without my prompting or you really having any reason to do so. They weren’t standing there. They’re not going to read our conversation. But you did it anyway. Even with your distributed workforce, you talked about this strong camaraderie that you have, in your words, “that even some physical offices strive to have.” What have they done to have this remote group of workers be such a strong community?
27:59 Elizabeth Koenig: I don’t know. I thought about this for a really long time, Patrick. There’s so many different parts to this puzzle.
28:06 Patrick O’Keefe: What drew you in? What made you be a part of that community? What makes you love being there?
28:11 Elizabeth Koenig: I think the caliber of people that I work with, it’s amazing. The Social Element has hired groups of people that I look forward to talking to every day. I wake up and I think, “Oh, my gosh. I can’t wait to talk to so-and-so about work,” which is crazy. I’m not a workaholic kind of person. I like to work; I enjoy my work; but I just really like hanging out with the people that I work with even though we’re virtual. So, I think it starts at the top with HR, probably, just making sure that you’ve got a really great team that works well together, and that everyone has an understanding of how everyone else works, which I think is really hard to figure out, especially in a virtual office. The other part to it is that I think that everybody feels really appreciated, for the most part. People understand we go through waves of work. Sometimes it’s super busy; sometimes it’s not as busy.
29:09 Elizabeth Koenig: But no matter what’s going on, everybody feels this… like I said earlier… that camaraderie. We feel we’re in it together. We’ve got to move this machine together. Nobody’s part is bigger or smaller than anyone else’s. The moderators on the ground are just as important as everybody in management trying to get everything going. I don’t know… just an appreciation of the work being done from all levels. It makes me really happy. I enjoy it.
29:39 Patrick O’Keefe: I once thought to myself… I think I’ve said this… “Appreciation is a routine.” Being appreciative of people in your life, of the people around you, and reminding yourself to always do that, and make it a part of your daily routine that you’re appreciative, and saying “thank you” to people. That comes from when I was a moderator on a community probably ‘98 to 2000. It wasn’t my community; I was just a member that was invited to moderate there. I was one of the most active members. I was a “senior moderator,” two of us, I think, on a team of like 10 or 12. I think the person that ran the community thanked me twice in like two years for the things that I did. That always stuck with me, because it’s so easy to say “thank you” and yet you don’t do it. It’s not that hard. So, do you think it’s a proactive thing there where you can thank people generally; you can thank the group for the work they did; you can thank people specifically as individuals in private and public in front of the group or directly through a private message? Is there an element of that that you feel is… I don’t know… unique or something that they do at The Social Element that helps to drill that home?
30:44 Elizabeth Koenig: Again, I think it starts from the top. Tamara has always been so gracious to everyone, even… I remember thinking about The Social Element. Right now, I’m feeling so good about them, but…
30:56 Patrick O’Keefe: That’s great.
30:57 Elizabeth Koenig: … it’s an amazing place to work. A lot of people view moderation jobs as pretty low on the totem pole, when the reality is that that’s what makes everything work. That’s what’s actually getting done. That’s what people are buying. Community management is the same kind of thing. Just being appreciative and respecting that work as much as you respect any other work, and being gracious about it, having that understanding all the way from the top that that’s the culture of our company, that’s how we roll. Everyone is important. Everybody’s work is important; so, we all have to treat each other that way. So, yeah, definitely, being gracious and appreciative, it’ll change the way you think about your life. It makes work really fun.
31:43 Patrick O’Keefe: Elizabeth, I am happy that you have a great workplace. There are probably some people who are listening who are envious of that quality of the workplace. But I am grateful that you came on this show and shared your experiences with us.
31:57 Elizabeth Koenig: Yeah. Thanks for having me. That has been so cool. I’m a fan.
32:00 Patrick O’Keefe: Well, I appreciate that. Thanks so much for listening to the show.
32:03 Elizabeth Koenig: Sure. Thanks for having me.
32:05 Patrick O’Keefe: We have been talking with Elizabeth Koenig, account manager at The Social Element. Find them at thesocialelement.agency and find Elizabeth on LinkedIn at linkedin.com/in/ekoenig. That’s E-K-O-E-N-I-G. For the transcript from this episode plus highlights and links that we mentioned, please visit communitysignal.com. Community Signal is produced by Karn Broad. See you next week.
Thank you for listening to Community Signal.