Why humans are kings of content moderation

“Everything in moderation — including moderation,” as the quip often attributed to that old wag Oscar Wilde goes.

In May 2017, Facebook announced it was hiring 3,000 extra staff to moderate its content and flag up inappropriate stuff. This was on top of the 4,500 it already had — an immoderate number of moderators, we think you’ll agree.

When a tech giant like Facebook makes a move like this, it’s a pretty good sign that when it comes to the tricky task of sorting good content from bad, humans know best.

Making judgement calls about content often comes down to context and intuition, and right now, computers aren’t up to the job. Would you trust a robot to referee a football match?

This might change. Facebook itself is ploughing millions into AI. But until then, nothing beats a good ol’ fashioned flesh-and-blood mortal.

We think the same.

At Reevoo, every customer review we collect — about four million a year — is moderated by a real person before it goes live. So are all the consumer questions and answers we facilitate, and the amazing stories and photos we collect through our Experiences engine.

Our 34 moderators speak 20 languages and cover all 43 countries we currently operate in.

They read each review, and if it breaches our guidelines, they send it back to the reviewer to edit and resubmit.

Around 160k reviews are rejected each year, for reasons including swearing, specific mentions of price and personal information.
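To give a flavour of what those checks involve, here’s a minimal, hypothetical sketch of rule-based guideline screening in Python. The patterns and the GUIDELINES table are invented for illustration only; our real guidelines are applied by human moderators and are far more nuanced than any regex.

```python
import re

# Hypothetical, simplified guideline checks -- illustration only.
GUIDELINES = {
    "swearing": re.compile(r"\b(damn|crap)\b", re.IGNORECASE),
    "price mention": re.compile(r"[£$€]\s?\d+"),
    "personal info": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # e.g. email addresses
}

def breached_guidelines(review_text: str) -> list[str]:
    """Return the names of any guidelines a review appears to breach."""
    return [name for name, pattern in GUIDELINES.items() if pattern.search(review_text)]

review = "Great camera, but £499 is damn steep."
breaches = breached_guidelines(review)
if breaches:
    # In our process, the review goes back to the reviewer to edit and resubmit.
    print(f"Rejected ({', '.join(breaches)}) -- returned to reviewer")
else:
    print("Approved -- review goes live")
```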

We’ll NEVER reject a review just because it’s bad.

We pride ourselves on delivering honest, relevant and appropriate user-generated content (UGC) for our clients. Human moderation is an essential part of that.

We’re not alone…

You might have heard about Norwegian broadcaster NRK.

They’re the ones making readers answer a question about the story they’ve just read before they’re allowed to comment.

The strategy is designed to make the below-the-line debate – which too often descends into drivel – a little more informed and considered.

Not only will people need to know what they’re talking about, but they’ll also have a few seconds to calm down and think about what they’re doing.
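The mechanism is simple enough to sketch in a few lines of Python. The quiz question and answer below are invented placeholders, not NRK’s actual implementation:

```python
# A minimal sketch of NRK-style comment gating: the reader must answer a
# question about the article before the comment form unlocks.
QUIZ = {
    "question": "Which broadcaster introduced this comment quiz?",
    "answer": "nrk",
}

def can_comment(reader_answer: str) -> bool:
    """Unlock commenting only when the reader answers correctly."""
    return reader_answer.strip().lower() == QUIZ["answer"]

print(QUIZ["question"])
reader_answer = "NRK"  # stand-in for real reader input
if can_comment(reader_answer):
    print("Comment box unlocked -- fire away (politely).")
else:
    print("Not quite -- have another read of the article first.")
```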

There’s a common theme here.

Nobody is questioning the value of this kind of content. Comments under articles, Facebook posts, consumer reviews – they’re all evidence that user-generated content is an essential and helpful part of life online.

We just need to make sure that the people contributing that content actually know what they’re talking about.

That’s why we place so much importance on our review collection methodology.

But we’re keeping our options open…

We’re keeping a close eye on AI and natural language processing, and there’s every possibility we’ll give that job (or part of it) to the robots if they can prove they can do it better.
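One plausible shape for that “part of it” is a hybrid: a model scores each review, the clear-cut calls are automated, and everything in the grey zone still goes to a person. Here’s a toy Python sketch of that triage idea — score_review is an invented stand-in for a real NLP model, not anything we actually run:

```python
# A toy sketch of hybrid human/AI triage. Thresholds and the scoring
# function are invented for illustration.
def score_review(text: str) -> float:
    """Placeholder: a real system would use a trained language model."""
    return 0.55  # deliberately uncertain

def triage(text: str, approve_above: float = 0.9, reject_below: float = 0.1) -> str:
    score = score_review(text)
    if score >= approve_above:
        return "auto-approve"
    if score <= reject_below:
        return "auto-reject"
    return "send to human moderator"  # the grey zone stays with people

print(triage("The delivery was quick and the product works."))
```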

But our 34 moderators do an amazing job interpreting the many shades of grey our lovely reviewers paint.

For now, nothing beats a real set of eyes.
