Community Moderation // Putting away the big stick

In 2020, social media seems more divisive than ever.

And this is creating even greater challenges for moderators trying to steer their brands through various social media minefields.

Businesses of all shapes and sizes need to take social media moderation more seriously. But that doesn’t mean you need to be merciless with the ban button.

Tim Hanslow is Head of Social/Community at Preface Social Media in Tasmania and has a decade of experience helping brands large and small build and moderate online communities. In that time, he has seen social media evolve – and the art of community moderation along with it.

“The broad advice I give is for people to focus on what it is they’re trying to achieve – and what things will get in the way of that,” Hanslow explains. “Who is the community we’re talking to? What conversations will we be having? Do you need to be actively moderating the things that you don’t want to happen, or do you need to push things towards what you do want to happen?”

Not as easy as some might think

Hanslow believes many businesses still see managing social media as an easy job that can be entrusted to the least experienced (or youngest) member of the team. “There’s this vision that social media has no value because it feels very accessible. It seems so simple because everyone can go in and click a button. Everyone can have a Facebook page. Everyone can comment on things all the time. How hard could this be?”

He calls this “invisible expertise”. When a brand like McDonald’s posts a meme, it looks very easy to replicate. But what most people don’t see behind the meme is the person with a comms degree who drew upon years of experience and deep knowledge to create a piece of content capable of achieving maximum cut-through.

Put away the big stick

Hanslow doesn’t see community moderation as primarily about “bringing out a stick and smacking people”. Instead, he suggests brands should be present and active on their pages to demonstrate the right behaviour, as most people will naturally adjust their behaviour to conform with the rest of the community.

“Otherwise, it becomes really easy for someone else to dominate the conversation. You can see articles that will have a sarcastic comment at the top, and it’ll have 500 likes and reactions to it. A bunch of those could be someone saying this is untrue or that’s racist or hateful or misogynistic, but it still rises to the top.” As a result, the conversation may no longer be about the article itself or the topic the moderator intended, but about how that commenter chose to frame the discussion.

By being present and active, the moderator can guide how a discussion develops by making the first comment and/or replying to other comments that deserve to be highlighted.

Assessing the risk

Some businesses might believe their social media spaces are relatively benign and therefore unlikely to give them any major headaches. However, Hanslow warns against complacency.

From COVID-19 to Black Lives Matter, today’s flashpoints can drag almost any brand into the fray. When even a tea bag brand has to defend itself from being associated with racists, no business is completely immune.

“In social media, it’s not necessarily the people who are in your store or who might become a customer. Facebook has nearly two and a half billion people using it actively per month. The barrier of entry to interact with your page or your brand is low. If someone wants to be a troll, they can troll 50 brands in one night with zero effort.”

Hanslow agrees that moderators need to be aware of far more than just what happens on their own pages, constantly alert for potential threats so that they can respond appropriately should an issue turn up in a conversation on the brand’s Facebook page. “Social media has brought community managers and social media people to this point where [they] need to keep an eye on everything. What are all of the issues happening right now? It’s being across all of the specifics and how quickly things change in turn.”

In short, moderators have to think like devil’s advocates, willing to imagine the worst-case scenarios in order to understand the potential risk to the brand. “There are two types of risk,” explains Hanslow. “How likely is this to happen? Maybe it’s not. But then the other aspect of risk is how bad could this get if it does go wrong. And I think that’s what a lot of people don’t consider in social media.”

This constant vigilance for negative comments and worst-case scenarios can take its toll on a moderator over time.

What are your tips for community moderation? We’d love to hear from you in the comments.

Jonathan Crossfield
Jonathan Crossfield describes himself as a storyteller because freelance writer, editor, content strategist, digital marketer, journalist, copywriter, consultant, trainer, speaker and blogger wouldn’t fit neatly on a business card. His regular column for CCO magazine, published by the Content Marketing Institute, would be better described as a series of angry rants fuelled by too much caffeine. Somehow, Jonathan has won awards for his writing on digital marketing, but they were so long ago it seems boastful to keep mentioning them in bios. He lives in the Blue Mountains near Sydney, Australia, with a patient wife and an impatient cat.