What does all that mean for a lowly tech startup hoping to create a community platform? You’re going to need a crash course in managing the trolls. Luckily, The Coral Project is here to help. It’s a journalism-oriented collaboration between Mozilla, The New York Times, and The Washington Post that’s designed to help outlets learn how to build a community, and they’ve recently published an article on MediaShift about managing abusive commenters.
Look at a New User’s First Few Comments
According to The Coral Project, cracking down on a new user’s first few comments lets you separate the trolls from the genuine articles immediately.
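One simple way to act on this advice is premoderation: hold a user's first few comments for human review, and only let them post freely once they have a track record. The sketch below is a minimal illustration of that idea; the `Moderation` class, the threshold of 3, and the method names are all assumptions for the example, not part of any real moderation tool.

```python
# Minimal sketch of premoderating a new user's first few comments.
# All names and the threshold here are illustrative assumptions.
from dataclasses import dataclass, field

PREMOD_THRESHOLD = 3  # hold each user's first 3 comments for review


@dataclass
class Moderation:
    # user_id -> number of comments a moderator has approved
    approved_counts: dict = field(default_factory=dict)
    # comments waiting for a moderator's eyes
    review_queue: list = field(default_factory=list)

    def submit(self, user_id: str, text: str) -> str:
        """Publish immediately, or hold for review if the user is still new."""
        if self.approved_counts.get(user_id, 0) < PREMOD_THRESHOLD:
            self.review_queue.append((user_id, text))
            return "held"
        return "published"

    def approve(self, user_id: str, text: str) -> None:
        """A moderator approves a held comment, building the user's record."""
        self.approved_counts[user_id] = self.approved_counts.get(user_id, 0) + 1
```

The point of the design is that trust is earned per user: a troll's first posts never reach readers, while an established commenter posts without friction.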
Get Users to Flag Abuse
While onboarding users, encourage them to report any bad behavior they notice. Keep your messaging simple and clear to get the best results — the article suggests the project’s own tool, Talk, to handle the process.
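A common way platforms turn user flags into action is a threshold rule: once enough distinct users flag a comment, it is hidden pending moderator review. This is a sketch of that general pattern under assumed names and an assumed threshold; it is not how Talk actually works.

```python
# Illustrative flag-threshold rule: hide a comment once enough
# distinct users flag it. Threshold and class are assumptions.
FLAG_THRESHOLD = 3


class FlaggableComment:
    def __init__(self, text: str):
        self.text = text
        self.flaggers: set[str] = set()  # distinct users who flagged
        self.hidden = False

    def flag(self, user_id: str) -> None:
        # a set means the same user can't flag twice to game the rule
        self.flaggers.add(user_id)
        if len(self.flaggers) >= FLAG_THRESHOLD:
            self.hidden = True  # hide pending moderator review
```

Counting distinct flaggers rather than raw flag events matters: it keeps one angry user from single-handedly burying a comment they dislike.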
Highlight the Good Comments
Finding a way to prioritize useful, constructive comments will steer the conversation away from those who rely on abuse and outrage in order to be heard.
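In practice, "prioritizing" good comments usually means a ranking function: surface moderator-featured comments first, then the rest by community approval. The sketch below assumes each comment is a dict with hypothetical `featured`, `upvotes`, and `ts` fields; none of these come from The Coral Project's tools.

```python
# Sketch of surfacing constructive comments first. The comment fields
# ("featured", "upvotes", "ts") are hypothetical assumptions.
def rank_comments(comments: list[dict]) -> list[dict]:
    """Featured comments first, then by upvotes, newest breaking ties."""
    return sorted(
        comments,
        key=lambda c: (c.get("featured", False), c.get("upvotes", 0), c.get("ts", 0)),
        reverse=True,
    )
```

Because the sort key puts the editorial `featured` flag ahead of raw upvotes, a thoughtful comment your team highlights outranks a pile-on, which is exactly the behavior this tip is after.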
Point Targeted Users to Support Networks
Creating a resource list will help users who are singled out for online abuse feel supported rather than alone.
Reply with Empathy
The first weapon a community editor should pull out? Empathy. Check out the rest of the project’s research for advice on more specific ways to handle community abuse, from abuse of a single user, to situations in which your own team is targeted, to ongoing abuse in general. If your community stands out as a healthy one in a world of increasing online abuse, you just might gain the edge you need to thrive. Read more advice on social media at TechCo.