Before buying on Amazon, most people check reviews. In fact, before buying anything, most people check reviews. 

But what if you discovered that a growing number of reviews were actually written by bots?

Digiday reported recently that:

Multiple small companies report they’re seeing one-star reviews of unverified purchases on their pages that are written with bad grammar, coupled with remarks like, “Great product satisfaction guaranteed.”

This is obviously a bad thing for the sellers concerned, because it drives down their average star ratings. And a bad thing for Amazon, because it erodes trust in the site.

Who is behind the bots? It’s unclear. Competitors, in some cases. But in others, the truth could be murkier — Hillary Clinton’s recent memoir received hundreds of one-star reviews on Amazon within hours of its release.

All too human

Here’s another worrying thought. Amazon says it uses AI to sort the good reviews from the bad: “We use a machine-learned algorithm that gives more weight to newer, more helpful reviews”.
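Amazon hasn’t published that formula, but here’s a minimal sketch of what “more weight to newer, more helpful reviews” could look like in practice. Every number and field name below is our own assumption, not Amazon’s:

```python
def weighted_rating(reviews, half_life_days=180):
    """Illustrative weighted average: newer, more 'helpful' reviews count for more.

    Each review is a dict with hypothetical fields: 'stars', 'helpful_votes'
    and 'days_old'. The half-life and the vote weighting are made-up numbers,
    not Amazon's actual formula.
    """
    total = weight_sum = 0.0
    for r in reviews:
        recency = 0.5 ** (r["days_old"] / half_life_days)  # weight halves every 180 days
        helpfulness = 1 + r["helpful_votes"]                # more votes, more weight
        w = recency * helpfulness
        total += w * r["stars"]
        weight_sum += w
    return round(total / weight_sum, 2) if weight_sum else None


reviews = [
    {"stars": 5, "helpful_votes": 12, "days_old": 20},
    {"stars": 1, "helpful_votes": 0, "days_old": 700},
]
print(weighted_rating(reviews))  # the recent, well-voted five-star review dominates
```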

But as AI improves, so do the bots. They’re getting better and better at sounding authentic. Researchers at the University of Chicago have trained a neural network to churn out convincing fake reviews based on millions of Yelp reviews.

Here’s how it did with a restaurant in New York:

  1. “I love this place. I have been going here for years and it is a great place to hang out with friends and family. I love the food and service. I have never had a bad experience when I am there.”

  2. “I had the grilled veggie burger with fries!!!! Ohhhh and taste. Omgggg! Very flavorful! It was so delicious that I didn’t spell it!!”

  3. “My family and I are huge fans of this place. The staff is super nice and the food is great. The chicken is very good and the garlic sauce is perfect. Ice cream topped with fruit is delicious too. Highly recommended!”

Scary, huh?

As AI and “natural language processing” advance at dizzying speed, there may come a time in the not-too-distant future when it will be impossible to tell the humans apart from the bots.

And at that point, we won’t be able to trust any open review sites any more.

Incentivised reviews

The rise of the bots only compounds another long-standing problem on Amazon. Because more reviews lead to more sales, it’s not surprising that there has been a rise in incentivised reviews: customers getting products at a discount or for free in return for a review.

What’s the harm in that, you may ask?

The problem is that Review Meta’s analysis of 7 million Amazon reviews found that incentivised reviewers give higher ratings.

That’s not a surprise: people are more likely to be positive about something they haven’t paid for, particularly when other customers are judging it against the (full) price they paid.

But Review Meta’s point is that these reviewers aren’t just normal people being offered a discount on something they were going to buy anyway; they’re serial reviewers who are picking up stuff on the cheap.


These incentivised reviews can have quite a noticeable effect: incentivised reviewers were found to rate products around 0.4 stars higher on average. That’s enough to send an average Amazon-reviewed product from the 54th percentile to the 94th percentile.
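To see why a lift that small matters so much, remember that most products’ average ratings are bunched close together, so a 0.4-star nudge leapfrogs a lot of the competition. Here’s a toy illustration; the distribution is invented for the example and isn’t Review Meta’s data:

```python
import random

random.seed(0)

# Invented distribution of product average ratings, clipped to the 1-5 star range.
product_averages = [min(5.0, max(1.0, random.gauss(4.2, 0.45))) for _ in range(10_000)]

def percentile(score, population):
    """Percentage of products whose average rating sits below `score`."""
    return 100 * sum(1 for s in population if s < score) / len(population)

base = 4.25             # an unremarkable product
boosted = base + 0.4    # the same product after incentivised reviews lift its average

print(f"{base:.2f} stars -> {percentile(base, product_averages):.0f}th percentile")
print(f"{boosted:.2f} stars -> {percentile(boosted, product_averages):.0f}th percentile")
```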

So people are buying products on the back of reviews that have essentially been ‘paid for’.

It also fundamentally calls into question using Amazon reviews as a basis for purchasing decisions. There are simply too many reviews and products to make weeding out the incentivised ones practical.

Full disclosure: we allow brands to send products to people we pick at random in exchange for a review ONLY IF the product hasn’t gone on sale yet. Once it’s in the stores, it’s purchasers only.

A solution could be to show the score of the verified, un-incentivised reviews as the default option and let people see the others if they choose. That’s the service that Review Meta are trying to offer; the software removes the reviews that it determines to be ‘unnatural’.

Review Meta then monetises through the Amazon Associates programme.
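Here’s a rough sketch of that ‘default score’ idea: show the rating from verified, un-incentivised reviews first, and keep the rest available on request. The field names and the incentive-detection step are hypothetical, and this is not Review Meta’s actual algorithm:

```python
def split_scores(reviews):
    """Two averages: one over all reviews, one over verified, un-incentivised ones.

    Each review is a dict with hypothetical fields: 'stars', 'verified_purchase'
    and 'incentivised' (e.g. flagged from a disclosure phrase in the review text).
    """
    def avg(rs):
        return round(sum(r["stars"] for r in rs) / len(rs), 2) if rs else None

    trusted = [r for r in reviews if r["verified_purchase"] and not r["incentivised"]]
    return {"all_reviews": avg(reviews), "trusted_only": avg(trusted)}
```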

One person’s comment under Review Meta’s video summed the whole situation up:

“I recently watched this video and I am happy to give it 4 stars out of 5. The information was presented in a highly entertaining way and was very informative. In the interest of disclosure, the maker of this video did provide me with a free copy to watch.”


But without getting too meta on you, we’d argue that the service Review Meta provides is just as biased. Their algorithm, at least for now, is far from accurate.

One review, for example, was deemed ‘untrusted’ for using repetitive phrases, but those phrases were simply product descriptions. And in some cases, the most trusted and least trusted reviews turn out to be the very same review.

(Image: Review Meta screenshot)

So what’s the answer, you may ask, if Amazon provides biased reviews and Review Meta don’t do the best job of filtering them out?

The best way to get authentic, honest reviews is to use a closed reviewing system.

That means collecting reviews only when proof of purchase has been received from the online shop, the manufacturer or the customer themselves (a receipt).
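In principle the gate is very simple: no confirmed purchase, no review form. A minimal sketch, assuming illustrative proof sources and field names rather than Reevoo’s actual implementation:

```python
ACCEPTED_PROOF_SOURCES = {"retailer_order_feed", "manufacturer_registration", "customer_receipt"}

def may_invite_review(purchase_record):
    """Closed system: a review invitation only ever goes to a confirmed purchaser."""
    return (
        purchase_record is not None
        and purchase_record.get("proof_source") in ACCEPTED_PROOF_SOURCES
    )

# Crucially, there is no public 'write a review' endpoint at all: reviews are
# invited, never volunteered, so bots and non-buyers have nothing to submit to.
```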

In the interests of openness and transparency, here’s our magic recipe:

(Image: the Reevoo review process)

No ‘write a review’ button, no open submissions, confirmed purchasers only.

That’s what Reevoo does.
