Moderation veteran Alison Michalk, CEO and founder of community management agency Quiip, joins me on this episode to talk about Facebook’s approach to moderation. Plus:
- Convincing executives to run companies like communities
- The legal climate for community builders in Australia
- How the music industry’s street team promotion model translated to the internet
“We’ve seen Facebook take a stance around, ‘We’re a platform, we’re a utility, we’re not in the business of being responsible for the content on our platform.’ That was always going to play out in a really dangerous way, I think.” -@alisonmichalk
“It would be very hard to argue that Facebook, by and large, is a respectful environment. Because their stance to me has been, ‘It’s on the end user. If you don’t like it, delete it. If you don’t like it, don’t look at it.’ As a community management practitioner, that’s not a great approach to take to building community. Of course, they probably haven’t really seen themselves as building community.” -@alisonmichalk
“One of the things that I’ve always loved about forums is the asynchronicity. You come on when you’re ready, you catch up on what’s happened, and you contribute when you feel like it. But if you look at a workplace, it’s about synchronicity, and being there nine to five. And the inefficiencies in that are massive.” -@alisonmichalk
About Alison Michalk
Alison Michalk is the CEO and founder of Quiip, a global leader in online community management and moderation services, and co-founder of community management conference Swarm. She has over 10 years of experience working with online communities.
Alison founded Quiip after identifying a need for online community management and social media services that minimized risk and maximized user engagement. She has built a world-class team of community managers who provide private, public and non-profit sector clients with 24/7 services.
In addition to founding Quiip in 2010 and Swarm in 2011, Alison established the Australian Community Managers group in 2008. It now has more than 2,500 members.
Alison’s experience building and managing online communities extends from start-ups to large corporations and government departments. Prior to launching Quiip, Alison led a team of 30 moderators managing over 180 forums as community manager for one of Australia’s largest online communities, Fairfax Digital’s Essential Baby.
When she’s not working, Alison enjoys out-of-city living on the Central Coast and raising two boys. She also likes meditating, coffee, playing Lexulous and thinking. A little too much.
In order of reference:
- Venessa Paech
- Community Signal episode with Venessa Paech
- Is Work a Place? Distributed Teams & the Future of Work presentation by Alison
- European Commission and IT Companies Announce Code of Conduct on Illegal Online Hate Speech
- Want to Know What Facebook Really Thinks of Journalists? Here’s What Happened When it Hired Some by Michael Nunez
- Facebook, YouTube, Twitter Sued for ‘Failure to Remove Homophobic Content’ by David Chazan
- Quiip on Twitter
- Alison on Twitter
00:04: Welcome to Community Signal, the podcast for online community professionals. Here’s your host, Patrick O’Keefe.
00:16 Patrick O’Keefe: Hello, this is Community Signal. I am Patrick O’Keefe, and thank you for joining us today. On this episode, we’re talking to Alison Michalk. Alison is the CEO and founder of Quiip, a global leader in online community management and moderation services, and co-founder of community management conference Swarm. She has over 10 years of experience working with online communities. Alison founded Quiip after identifying a need for online community management and social media services that minimized risk and maximized user engagement. She has built a world-class team of community managers who provide private, public, and nonprofit sector clients with 24/7 services.
00:49 Patrick O’Keefe: In addition to founding Quiip in 2010 and Swarm in 2011, Alison established the Australian Community Managers group in 2008. It now has more than 2,500 members. Alison’s experience building and managing online communities extends from startups to large corporations and government departments. Prior to launching Quiip, Alison led a team of 30 moderators managing over 180 forums as community manager for one of Australia’s largest online communities, Fairfax Digital’s Essential Baby.
01:15 Patrick O’Keefe: When she’s not working, Alison enjoys out-of-city living on the Central Coast and raising two boys. She also likes meditating, coffee, playing Lexulous, and thinking a little too much. Alison, welcome to the program.
01:26 Alison Michalk: Great. Thank you for having me.
01:28 Patrick O’Keefe: It’s good to have you on. I had Venessa Paech on a while back, and you are both involved in a couple of projects together, Swarm being one of them. So I kinda like to space people out. I would’ve had you on sooner, but yeah, it’s great to finally get you on the program here.
01:41 Alison Michalk: Absolutely. Yes, Venessa and I do a lot of stuff together here in Australia. It’s a small scene, very fortuitous that we’ve been able to work together over the years.
01:50 Patrick O’Keefe: Yeah, and I think highly of both of you, so I appreciate you sharing your knowledge here on the program with us. So, in the mid-2000s you were director of Australian Street Teams, which provided online and offline marketing services to record labels and artists. Now, the idea of street teams, for those unfamiliar, was often to utilize passionate youth, or people looking for an entry into the music industry, and have them literally hit the streets: put up stickers and posters for particular artists, call local radio stations and request their music, convince people to go to their shows, and so on.
02:18 Patrick O’Keefe: Parts of the street team model translated well to the internet, and there were and are numerous services that try to capitalize on that. Being a huge Bad Boy Records fan myself, they were a label that really pushed street team efforts as part of their growth in the ’90s. Later in the 2000s, I joined a website called i-Squad, and they would give you missions, like request a song on TRL. And if you don’t know what TRL is, it was an old show in the US on MTV called Total Request Live, and you requested music videos on it. It might seem like an alien concept now, but that was a thing back then. Or they’d ask you to call your local radio station, and you would receive points for doing so on this website, points that could be traded for merchandise.
02:54 Patrick O’Keefe: But then there were parts that felt spammy, like posting in online forums and message boards to promote the artist. And I was actually managing communities at that time, and I was like, “Nope, not doing that. And you shouldn’t be doing that.” And I feel like this attitude still permeates artist promotion today. I still get spam emails from artists with MP3s attached, even though I haven’t written a music blog in years. When you were trying to bridge offline and online street team efforts, were the labels trying to push you into any ethical land mines? Did you find anything that gave you pause as you tried to bridge those two worlds?
03:24 Alison Michalk: Well, what a great question, and what a great explanation of that scene and that era. For me, it was quite interesting ’cause it marked my foray into online. So I had a background working in music magazines, and this is early 2000s. And I saw the increase of internet penetration and thought, “Magazines are dead.” We’re working with a six-week lead time, all of a sudden people have Hotmail addresses, and bands can contact fans directly. And I sort of predicted, “That’s it, magazines are dead,” which is quite funny ’cause the magazine I started is still around a decade later.
03:56 Alison Michalk: But yeah, so for me I was quite excited about the opportunities there for bands and fans, and that was around the time that Myspace was getting more popular. Because I ran the company myself, and it was being done well overseas, but it wasn’t being done much in Australia at that time, I didn’t feel like I was pushed too much. And I worked with a lot of small record labels that I liked, so I had a lot of direction in terms of what we did and didn’t do. We did ring radio stations, and in Australia we have the Hottest 100, like, on Saturday night, or maybe it was 50 at that time. And people voted for the songs to be played. And I think we ended up having a lot of influence on that show, and the show did later get canceled. And I always wonder if our efforts contributed to that, which makes me feel quite bad ’cause it was a great show on Triple J, which is a government-run radio network here in Australia.
04:45 Alison Michalk: So yeah, nothing that I felt was too unethical at that time. I mean, we were more led by what fans felt like doing. We didn’t have the sophisticated point system, or anything like that. Having worked with bands for so long, I knew a lot of band members, and we were able to get them to write emails to the fans and stuff. So it was really more about connecting fans, and connecting them to bands. And I guess a little bit more community-focused than perhaps how some of the street team companies were run.
05:12 Patrick O’Keefe: Now, did you mostly work with Australian acts? Or did you have acts from, say, the US? I noticed you had clients like Sony BMG and EMI, big global companies. Did they ask you to promote acts from other countries in Australia, too?
05:24 Alison Michalk: Yes, and I was always interested in pop punk sort of stuff, so we worked with Epitaph, and Fat Wreck, and a lot of those labels, and they had a lot of success in Australia off the back of working with street team companies. So yeah, kind of stuck to that genre. I stuck to music that I was interested in promoting, because I guess that also felt aligned with my values as well. I wasn’t all of a sudden telling people that this great new hip hop artist was amazing. It was stuff that I was quite passionate about, so that worked.
05:54 Patrick O’Keefe: Now, the reason I asked that is, did you notice any difference, say, between those global companies, or companies contracting you from other countries, and the Australian-based companies, as far as how they approached the internet, how they interacted with fans? Was it all the same or were there some differences between the two?
06:08 Alison Michalk: Well, I think the American market just is so much bigger than Australia. We have a population of 20 million here, we’re tiny. I think that, generally, just comparing any sorts of companies and startups, you are working with massive audiences in America compared to here. I think companies tend to have more funding, and they’re more sophisticated, and there’s more at stake, and so we were very grassroots. It was me and a few of my friends sitting around a coffee table, hand-stuffing envelopes with stickers to send to people.
06:41 Patrick O’Keefe: Nothing wrong with that. I think people do that more now with all this swag that goes around, just don’t offer it as a job benefit. Swag’s not a job benefit.
06:50 Alison Michalk: Yes.
06:50 Patrick O’Keefe: I was reading a slide deck that you gave on remote workforces. “Perks are not culture,” I think, was a line from it. It might have been someone else, but someone at Quiip gave the presentation. But I definitely agree with that, perks are not culture. Even within a community where people participate for free, perks are not culture.
07:03 Alison Michalk: Absolutely, spot on.
07:05 Patrick O’Keefe: So, just the other day the European Commission, Facebook, Twitter, Google, and Microsoft, four horsemen, if you will, announced that they had all agreed on a code of conduct related to illegal hate speech in the EU. Most of it seems like pretty straightforward stuff: have community guidelines. I actually like that they use the term “community guidelines,” because that’s one I use a lot myself. Not rules, community guidelines. I go, “Hey, they use that term. That’s good.” But, have community guidelines, allow people to report violations, act on valid reports, educate people on what is okay and not okay, comply with national laws, and share best practices. Those seem like things we should be doing. But did you have a chance to take a look at this story, and did anything about it stand out to you from a moderation perspective?
07:46 Alison Michalk: It’s such an interesting topic. I’m not completely familiar with EU law in general, so that’s a little hard for me to speak to. But I think what I’ve found so fascinating about this journey of platforms is that people like yourself, and me, and I’m sure other community practitioners that’ve been doing it for a long time, have witnessed the lack of moderation tools. And we’ve also seen Facebook take a stance around, “We’re a platform, we’re a utility, we’re not in the business of being responsible for the content on our platform.” And I guess that was always going to play out in a really dangerous way, I think. And I still question Facebook’s stance on some of these things. I noticed one of the quotes in the article was that they want to give people the power to express themselves, while ensuring a respectful environment.
08:28 Alison Michalk: I think it would be very hard to argue that Facebook, by and large, is a respectful environment. Because their stance to me has been, “It’s on the end user. If you don’t like it, delete it. If you don’t like it, don’t look at it.” I think as a community management practitioner, that’s not a great approach to take to building community. Of course, they probably haven’t really seen themselves as building community. Look, I welcome this approach, I think it’s fantastic, and it makes sense, and like you say, it’s obvious. I don’t know how it’s going to be executed. And I do have concerns that when your platforms have been around for as long as they have, and haven’t baked in community management and moderation principles from the get-go, can you steer the ship? Can you turn that? I mean, it’s a beast, it’s a beast of a ship for all of those companies.
09:16 Patrick O’Keefe: Yeah, it’s almost like co-opting the term “community.” Because Facebook is a platform for X, right? It’s a platform for people to connect, it’s a platform for me to post that I listen to rap music. It’s not necessarily a community, unless you use it like a community, and you use it to connect with people. But Facebook, I guess, as an organization is, as you alluded to, pretty ambivalent toward the idea of community. It’s a communications platform, and you use it how you wanna use it. And, as you pointed out, that’s not a comfortable position for a community professional to take, that it’s on the end user. That goes against the idea of community management in general, I think, and the discipline as a whole; that’s not what we do. We create specific environments that cater to specific groups of people, and that’s not what Facebook is. So it’s interesting to hear them speak in those terms. Even reading through the announcement for this news, you can see it’s a carefully negotiated document, right?
10:05 Alison Michalk: Oh, absolutely.
10:06 Patrick O’Keefe: Everyone’s working on it to just have a little bit of leeway in there, to have an out-clause and everything. For example, one of the things that got a lot of attention was the 24-hour window for removing illegal hate speech. I’ll just read from it here. It says the participants are promising to, quote, “review the majority of valid notifications for removal of illegal hate speech in less than 24 hours, and remove or disable access to such content if necessary,” end quote. Again, it’s important to note the wording here: a majority, i.e., more than 50%, even just 50.1%, of valid, valid notifications in less than 24 hours, and remove or disable that content if necessary. Again: majority, valid, if necessary. There are some important qualifiers there, and at the end of the day, the platforms will probably be the ones who decide what is valid.
10:55 Patrick O’Keefe: So, 50% of that valid, and then what’s necessary. And I’m not saying that’s not how it should be, because I think 24 hours is a window that seems like a lot of time to people who don’t use the internet all the time. But that’s tough. It’s good, I think, that it’s not all notifications, it’s not all decisions within 24 hours, as that would seem to be a bit difficult, even for Facebook, because of how large they are. You can’t just print off people to throw at this. And they can, of course they should, write algorithms. They write algorithms for the news feed, so they obviously have algorithms for spam and hate speech. And there are companies out there that specialize in hate speech and grooming, and more complex issues than just spam, filtering companies like Inversoft. Now, you work in moderation, you’re a moderation veteran: the 24-hour window plays well in the press, but given the scale of these services, do you think that’s overly burdensome?
11:48 Alison Michalk: Yeah, I fail to see how they’re going to execute this. Like you say, they can build complicated algorithms, but to date, we know there’s always ways to get around moderation. There’s a certain naivety around this stance, and like you say, there’s multiple parts of that sentence alone that give them exits. And in fact, it could just be all talk, and they don’t have to live up to any promise, because they get to say it’s not valid, it wasn’t necessary, and so forth. And Facebook’s taken such a weird stance on so many things, like breastfeeding images, and it’s got this one universal policy, one size fits all, which clearly isn’t working at all. So I think that, even as an outsider, we know that moderation is not black and white. The black and white, that’s the easy decision, and that’s what the algorithm picks up. And the grey part, which is a massive part in the middle, requires people to make judgment calls on content, and we’re talking about just unfathomable sorts of volumes of content.
12:48 Patrick O’Keefe: Yeah. A lot of people think of the worst speech, think of just rampant racism, antisemitism, awful things. And this applies to hate speech, so it’s important to point that out: 24 hours for, quote, “illegal hate speech,” so it’s however that is defined in the EU. And of course they’ll make this easy, people will have to specifically choose hate speech, and I’m sure they already have that. So when people report, it’ll be: This person is in the EU. This person selected hate speech. This goes into the folder, this goes into the priority folder, these reports go here. If they chose anything else, even if it’s hate speech, I would assume, and I might be wrong, ’cause assumptions are dangerous, but I would assume that it will not go into that priority folder. That’s part of the social contract here, because they’re saying, if we report it… They’re not saying, “We’ll be proactive.” They’re not saying, “We’ll filter for this.” They’re not saying, “We’ll search for all slurs, and then view those, and be proactive.” That’s not what they’re saying. They’re saying that, “When we get reports, we will act on them within 24 hours if they are valid, if they are illegal speech, if necessary.”
13:45 Patrick O’Keefe: So again, that’s a lot of qualifiers. But it’s one of those things where there is gonna be a line here for illegal hate speech, and Facebook will be the arbiter of that line. And that’s what we all are as community professionals, on some level, with the communities we manage: we’re arbiters of the lines. And the big thing is to be consistent in those lines, though Facebook might wanna avoid this definition, because they caught a lot of flak here in the US for maybe editorializing trending stories and possibly going against conservative viewpoints. And I don’t know, I’ve read the story, it was a Gawker story, and I don’t know if it gave the full picture, to be honest with you, but let’s say it’s true. Moderation is editorializing. Moderation is creating the narrative for this community. In the US, anyway, you don’t have to remove racist speech; it’s not illegal to have that on your website, we have that freedom here in the US. It’s awful, I would never host it, I ban people when they do things like that, I don’t give them warnings. But that’s my community, that’s my choice. So Facebook has to have a voice, I think, and say, “This is our voice.” That’s what I think this is saying, and so we’ll see if they back it up.
14:44 Patrick O’Keefe: But moderation is creating the narrative for your community. We might wanna be soft about it, walk on eggshells, and try to define it in a different way and say, “Oh, we allow free speech, and we’re just kind of weeding out the bad stuff.” But we’re making choices about the types of people that will be attracted to our community. And so Facebook might have to finally be harsher on this, and we’ll see if they are, take a stance and say, “You know what? We don’t want these people. If you’re an antisemite, we don’t want you. If you say anything vaguely antisemitic, if you say anything vaguely racist, if you are a homophobe, whatever it might be, we don’t want this.” And so they need to enforce those policies. Because it’s just like on Twitter. Twitter has had policies forever; the problem is they don’t enforce them.
15:24 Patrick O’Keefe: I don’t wanna slam Twitter because I think Twitter and Facebook have unique challenges. They’re dealing with a level of stuff… I have complete compassion for people who work in the abuse departments at Facebook and Twitter. I have so much compassion for those people because I know abuse is tough just in my little circle of the world, I can’t even imagine what they have to deal with.
15:39 Patrick O’Keefe: It’s tough because you have policies and they have to be enforced. And I think that’s where people are struggling, is where they don’t have that line that they’re actively enforcing. And so they get people who wonder, “Why is this here, why is that not there?” There was a lawsuit from a Jewish student group in the EU, I believe, and I don’t know if it prompted this, but it’s definitely being mentioned alongside the story, where they actually sat and reported things for, let’s say, a month, and took note of what was removed and what wasn’t, and said, “Well, 90% of the stuff wasn’t removed.” And that’s not to say that just because they reported it, it should be removed, because that’s a dangerous thing, but they actually had the quotes there and said, like, these are Holocaust deniers, this is what they’re saying: “Do you allow this, or do you not allow this?” So what happens to those reports? Facebook, I don’t know, at best maybe they’re letting things fall through the cracks, but I’d love to see them step up and have that voice and be that narrative.
16:26 Alison Michalk: And I think that’s quite frightening, in many ways, historically looking at Facebook’s stance on a lot of things. I know in Australia alone we’ve had a lot of problems around gendered violence online, and men’s rights activists, and I’m sure this is something that’s happening globally, though I don’t like to…
16:43 Patrick O’Keefe: Those delightful individuals [chuckle]
16:45 Alison Michalk: To speak on a global level, but we’ve got some feminist activists online here that have had their own content banned on Facebook when they’ve re-shared it and said, “This is the sort of violence that I’m copping.” And the original content was okay by Facebook, but wasn’t okay when it was shared by these women. So there have been some really kind of frightening precedents set. And as you say, moderation is editorializing. And your community is shaped by thousands, and in this case, millions or billions of interactions, and it does require a judgment call. This is the direction that it has to go in, but I do think it’s slightly scary in a way, given Facebook’s position on some of these topics in the past, as to where things are gonna fall, because there is a lot of power in doing that. And it’s a massive, massive undertaking, and I’m not really sure the four horsemen… [chuckle] They must, to an extent, be aware of what they’re up against. Surely?
17:47 Alison Michalk: But it’s a huge undertaking, and it’ll be interesting to see how it plays out. We’ve long spoken about the lack of moderation tools on these platforms. Forums have been around for decades and decades, and their moderation functionality is so much more sophisticated than Facebook’s. And that’s been one of my grievances: Facebook’s taken this position, like, “It’s not on us, it’s on you.” And it’s like, “Okay, well, that’s fine, but maybe give us some tools to work with,” because they are extremely limited. You look at just the profanity filter alone, no one even knows what low, medium, and high entails, it’s just a guess.
18:23 Alison Michalk: I mean, that is the most rudimentary profanity filter ever, and that’s it. We just have so little at our disposal, and I think they really need to ramp that up. Of course, that’s never been in their interest, because it’s not revenue-generating for them to care whether or not some lowly community manager on Facebook has moderation functionality, but it has to scale, doesn’t it? And I have seen the introduction of more tools. I think they’ve recently introduced moderators, mainly in groups.
18:49 Patrick O’Keefe: Yeah, Facebook has… And I’ve said this before, and it’s still true now: they don’t have the mod tools of a forum software in 2000. Literally, with phpBB 1, I had more moderation functionality then than you get in Facebook groups right now. It’s ridiculous, and it does speak to their goals. When you speak of concerns about how Facebook has handled things, and then what this could mean, are you concerned about freedom of speech, or are you just concerned about how they’ve chosen to apply their rules in the past?
19:11 Alison Michalk: I think both. We don’t have freedom of speech here in Australia, so we already have to moderate for discriminatory and defamatory content. So in some regards, we’re more closely aligned to the EU there. I just am concerned about them as a private entity making calls around… Like this stance that they’ve taken on people not being able to show breastfeeding photos. I mean, it’s kind of outrageous to me, the level of sexist content on there. And I assume the hate speech laws cover sexism, but I noticed they seem to focus a lot on racism and xenophobia, so I’m not really sure about that.
19:51 Patrick O’Keefe: Yeah, I do my part in that. My mom was a doula, working in childbirth for years, and I’ve always been taught breastfeeding is a beautiful, natural thing. It’s a good thing, and so I share things on Facebook, and I get some comments here or there from people, like, “Why are you sharing this? Stop that. I find that disgusting.” I’m like, “No it’s not. No it’s not.”
20:08 Alison Michalk: Wow, yeah. And that’s the thing, it’s so impactful, it’s scaled, Facebook. I mean, the power…
20:13 Patrick O’Keefe: Breastfeeding is impactful at scale. [chuckle] But no, I get it, and it’s because of their power that I think this concern is real. Because of how important they are for everyone, as a communications platform. Well, not everyone, but a lot of us, a lot of people in this space. So let’s stick with that. You talked a little bit about how you don’t have freedom of speech in Australia. And at Quiip, one of your key areas is risk assessment and mitigation. So what is the legal climate like for communities, and community builders in Australia, as far as liability for what’s posted on their community, as far as free speech, and as far as just legal challenges and concerns in general?
20:48 Alison Michalk: Sure. Yeah, well, there’s a lot of consumer and media law that our team needs to be across, but that said, I do think… We’re working, increasingly, with government departments doing online communities, which I think is really bold, and innovative, and progressive. And for four years, we worked on a social media campaign aimed at reducing domestic violence against women, focused on having discussions with young people around respectful relationships. So we were talking about dating, bullying, sex, relationships with parents, things that were happening at school. All these sorts of really complicated topics, but that was a federal government campaign, and they were happy to dive into that space.
21:26 Alison Michalk: So yeah, it all comes down to having the right client, a client that, as we talk about it, is risk-prepared, not risk-averse. So I think it’s really just about having those really robust frameworks in place. We worked for a number of years with Reach Out, which is a suicide prevention organization. So we worked in their forums, and again, really challenging and dangerous content, basically highly sensitive content. But again, it’s just about having a framework and a procedure to follow. So like anywhere, there’s laws and there’s content that you need to look out for. But in a lot of ways, I think… I mean, I can’t speak to moderating in the US, but I think knowing that you’re allowed to not tolerate that content, it doesn’t come down to an ethical decision. It’s like, we will not have content that is of a discriminatory or defamatory nature, and so forth.
22:13 Patrick O’Keefe: Right. So in Australia then, if someone gets sued because of content on a community, who gets sued?
22:18 Alison Michalk: Well, interestingly, there aren’t many cases in Australia. There have been cases where the internet service provider has been gone after, so that’s an interesting one.
22:29 Patrick O’Keefe: So the internet service provider… The person posts the bad content, and they have an internet connection through X, and X is the one being sued, not the host of the content or where they posted the content.
22:38 Alison Michalk: Yeah, so there was one case where it was like, they were allowing this company to host a forum, so they were targeted. But there have been cases of individuals being targeted as well, and being sued for content that they posted online. But again, we don’t have a lot of case studies here yet in Australia, so everyone’s sort of cautiously wondering who’s gonna be the canary down the mine, so to speak. [chuckle]
23:02 Patrick O’Keefe: Oh, so because you don’t have that test case, you have sort of an air of caution, where people don’t wanna be that test case, even if they’re the ones hosting the content? Like in the US, we have something called Section 230 of the Communications Decency Act. What that says is that, basically, we’re not liable for what’s on our community, unless we’re a part of it or we induce it. We have the discretion to moderate and say, “This is okay, this is not okay,” and then not be held liable for those decisions. Of course, there are some exceptions, like it’s not a full-on blanket protection for idiocy, but we do have protections, and I’m very fortunate that we do. So it sounds like in Australia, like the EU, those protections aren’t as strong for community managers and hosts, or maybe they haven’t been tested yet, either way. And so there is, again, that desire not to be the one who gets tested.
23:48 Alison Michalk: Yeah, and I think there’s a flip side to that. It’s an air of caution, but it’s also a bit of an unknown, and well, so far so good. [chuckle] I do think, and I don’t know how much I’m speaking out of my realm of expertise here, but to do with the internet service providers being responsible, I have a feeling piracy’s part of what’s driven that as well. So, if you’re hosting content where people can download TV shows and those sorts of things, they’re trying to make the internet service provider be responsible. So I think maybe piracy’s driven that more than, say, online community content.
24:21 Patrick O’Keefe: The money. The money’s driven it. It’s always… That’s the thing, we get good protections and then it’s like the bad apples, these suckers, they come in and they ruin it for us. ‘Cause they take it to an extreme and then, “Oh, well, we need to legislate now.” And we all get caught in that brush. It’s always the bad people who ruin it for the responsible community professionals. And that’s just life. That’s life in general.
24:41 Alison Michalk: That’s true.
24:42 Patrick O’Keefe: You are trying to convince executives and managers to run their company like a community. When you started Quiip, you didn’t have much traditional management experience but had a lot of experience managing communities. And so you applied those philosophies to your company. What is the key difference between traditional management of a business, and then treating it like a community?
25:01 Alison Michalk: Sure. Well, I think traditional management is very much the top-down hierarchy. Information flows from the top to the bottom and it’s a bit of command and control. So it’s, “Do as I say.” And I think that other styles of management are more like that trust-and-track approach, which is a little bit more of what we use. Everyone works from home, and everyone works very autonomously, and everyone works very collaboratively, though. So I think sometimes, people think we have these 20-plus people just sitting in their offices isolated, but we have the benefit of all being community managers who actually know how to communicate very well online, and we know how to work as part of a team. I once read something about community management. It spoke about leading from behind and I really liked that idea. It’s sort of, you do have to get your community going in the direction they want but ultimately it is a user-led community.
25:51 Alison Michalk: We all know that companies build a community with one objective, but the community has a completely different idea about what they wanna do or what they wanna get from the community. And I think a company can be like that to an extent, so it’s sort of working together and saying what projects they’re interested in, or how they think we can improve the company, or what we can do. So we do work very well as a team and it is more of that networked community approach, and I think that’s the feedback that we’re getting, too. It’s not just a dream in my mind. But again, we’re very lucky that we are all community managers, so I think that it’s been a lot easier to implement than perhaps at a company in a different industry. But yeah, that’s sort of what it means to me.
26:34 Patrick O’Keefe: And leading from behind puts you in the best position to kick people in the butt. When you’re already there, it’s easy. I don’t know why I thought of that when you said that. Yeah, it’s like, “Hey, there you go.” But I know you and Venessa are both working on this and talking to different companies and consulting based upon this. What are the reservations? Are you being met with some reservations? Some concerns? Some roadblocks? What are the things you’re hitting as you explain these concepts to people?
27:00 Alison Michalk: I think it all comes down to, trust is still the biggest thing. And I still get that when I talk about Quiip being a distributed team, and people are like, “How do you know that people are doing work?” My take on that always is the risk actually is that they’re doing too much work, not that they’re doing too little. So when you’re at home, the risk is that they are staying on the computer a bit too long or, “I’ll just finish this report tonight.” We all know what that’s like. Your computer’s there. There’s that great quote from the book from Jason Fried of Basecamp, where he says, “If you’re just worried about whose bum’s on a seat, you’re actually managing chairs, not people.” And I really like that. So it’s about having an output-based work model.
27:40 Alison Michalk: But yeah, so coming back to your question, I think people really just think, “Oh, how would that work? Like, if we change the whole dynamic of not telling people what to do?” I mean, shifting a traditional organization to being collaborative, and working like a community, it’s a big one. And I think that you mentioned it before, like perks are not culture. That’s sort of what we’re seeing, where companies are like, “Let’s put in a ping pong table, and let’s put in a cafeteria.” And moving towards more of that American startup type of environment.
28:10 Patrick O’Keefe: Us Americans. Our ridiculous startup culture.
28:15 Alison Michalk: And I think that there are some dangers around using that approach, because culture is either formed intentionally, or it is formed through neglect. And if you just throw a few ping pong tables in, and think, “Everyone’s gonna… This will all work out fine.” I also think that’s a very, I don’t wanna say male-centric, but lots of these companies that are leading these changes are, like the typical employee’s like a 25-year-old male software developer. It doesn’t really cater for a diverse workforce. Me speaking as a mother, I don’t wanna hang around and play ping pong at 5:00 PM, I just want to get my job done, and go and pick my kids up from school. So I think sometimes…
28:56 Patrick O’Keefe: Or you wanna have the freedom to pick them up at school, and maybe work an hour or two later, or work earlier. It sounds like a lot of this is… And when you talk about trust it’s about, again, trusting people to get their work done because they want to, not because they have to, not because they have to do it within a set number of hours or a set time during the day, but because they are good at what they do. And so, giving them the freedom and the trust to say, “You know what, these are your tasks, get them done, get them done however you’re comfortable getting them done and collaborate with people.” And when it’s communities even though, for the most part when we do community we don’t pay people, it’s not a job, but they show up because they want to, they contribute because they get value from it. It’s self-fulfilling. And so, ideally work would be that way.
29:38 Alison Michalk: Absolutely. And one of the things that I’ve always loved about forums is the asynchronicity. That’s the thing, you come on when you’re ready and you catch up on what’s happened and you contribute when you feel like it, but if you look at a workplace, it’s about synchronicity, and being there nine to five. And the inefficiencies in that are massive. No one feels switched on for an eight-hour block, and then they switch off, and that’s not even eight hours, now it’s like a 10-hour workday or whatever. Whereas with Quiip, we do exactly as you say. It’s like, “This is what we’re trying to achieve, do it when it suits you. I don’t care, I don’t need to know if you’re starting at 10 or 12 or 2, or having a sleep, or going to basketball practice, just deliver me the work.” And the efficiency and productivity wins, to me, are really obvious. And it’s so wasteful, just to say, “Yeah, let’s work nine to five.”
30:23 Patrick O’Keefe: I agree. And if you come across any companies that need someone like me, who fit that model, you just let me know, Alison.
30:30 Alison Michalk: Well, absolutely.
30:32 Patrick O’Keefe: Well, it’s been a pleasure to have you on. Thank you for joining me on the show today.
30:35 Alison Michalk: Thank you so much for having me, it’s a total honor. I would usually have a million questions for you because I know your answers to all these things would be equally as fascinating. So I really appreciate the opportunity to chat with you Patrick, thank you.
30:47 Patrick O’Keefe: Thank you. We have been talking with Alison Michalk, CEO and Founder of Quiip at quiip.com.au, that’s Q-U-I-I-P dot com dot au. You can follow Quiip on Twitter @quiip, and follow Alison @alisonmichalk. For the transcript from this episode plus highlights and links that we mentioned, please visit communitysignal.com. Community Signal is produced by Karn Broad and we’ll see you next time.
Thank you for listening to Community Signal.