Recently some friends purchased a GPS locator for their daughter and were having trouble getting it to work. They brought it to me for help — I’m the Geek Squad for my friends and family — but I couldn’t solve the problem, either.
My friends were puzzled: “It had a five-star rating on Amazon!”
I pulled out my laptop and checked the product page. Sure enough: 37 five-star reviews. But this thing was undeniably a lemon. What the heck?
Mystery solved: Every single review was a fake.
Fake news, meet fake reviews
What’s a fake review?
Exactly what it sounds like: a review posted by a company employee or anyone else with a vested interest in selling more products.
One or two fakes: no big deal. Lots of them: now you’ve got an artificially inflated product rating. It’s way too easy to glance at a four- or five-star average and think, “OK, this must be good!” Few folks are going to take the time to dig into each and every review — or every reviewer — to look for red flags.
Here’s a great example: You’re in the market for a GoPro-style action camera. A real GoPro will run you $200-$400 in the US, but there are countless knock-offs priced as low as $40-$50. But they can’t possibly be as good, right? Well, they look like GoPros. They come with lots of accessories. And here’s the kicker: high marks from dozens or even hundreds of reviewers. Sold!
The problem is, dozens or even hundreds of those reviews might be fake — or at least questionable. It’s hard to know for certain, but there are telltale signs. More on that below.
But shouldn’t Amazon be doing something about this? About a year ago, the company promised to start cracking down on incentivized reviews, meaning those posted in exchange for free or discounted products. Sure enough, I’ve seen fewer reviews with that disclaimer embedded — but that doesn’t mean there’s been a decrease in illegitimate reviews.
Indeed, in my world, where I frequently write about lesser-known tech brands and products, not much has changed. So let’s talk about the tools you can use to spot fake reviews and — just as important — how to interpret the results.
X marks the Fakespot
First up is Fakespot, a free site that analyzes Amazon product reviews to help you separate the wheat from the, well, fake. All you do is copy and paste the link to the product page, then click Analyze.
The service also offers browser extensions for Chrome, Firefox and Safari, all of which make it even simpler: Just click the Fakespot icon in your toolbar for instant analysis. It’s also available for Android so you can use Fakespot on the go.
Fakespot analyzes both reviews and reviewers, looking for questionable spelling and grammar, number of reviews, purchasing patterns, mismatched dates and other telltale signs of suspicious review activity. For example, a reviewer who’s new to Amazon, has posted only one review and uses lots of words like “great” and “amazing”? That review is almost certainly going to be marked “unreliable.”
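To make those signals concrete, here’s a toy sketch of that kind of heuristic in Python. The word list, the threshold and the rules are invented for illustration; Fakespot’s actual model is proprietary and far more sophisticated than this:

```python
# Toy heuristic in the spirit of the signals described above:
# a brand-new reviewer whose one review is packed with superlatives.
# The word list and the 15% threshold are made up for this example.

SUPERLATIVES = {"great", "amazing", "awesome", "perfect", "best"}

def looks_suspicious(text, reviewer_review_count):
    """Flag a review if the account is new and the text is mostly gushing."""
    words = text.lower().split()
    if not words:
        return True  # an empty review tells us nothing useful
    superlative_ratio = sum(w.strip(".,!") in SUPERLATIVES for w in words) / len(words)
    # New accounts with a single gushing review are a classic red flag.
    return reviewer_review_count <= 1 and superlative_ratio > 0.15

print(looks_suspicious("Great product, amazing quality, best ever!", 1))    # True
print(looks_suspicious("Battery lasts about six hours in my testing.", 12)) # False
```

Real services combine dozens of signals like this — purchase verification, posting dates, reviewer history — but the basic idea is the same: score each review, then flag the outliers.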
After the analysis is finished, Fakespot provides a letter grade based on the total number of reviews and how many were unreliable. And that’s where things can get a little confusing: If you’re looking at one of the aforementioned cameras and it gets an “F” because, say, 57 per cent of the reviews were marked as unreliable, you might be a lot less inclined to purchase it.
Ah, but does that mean the product itself is bad? Not necessarily. More on that in the next section.
Next, there’s ReviewMeta, which takes a very different approach, according to developer Tommy Noonan. Although it’s functionally similar — paste in an Amazon link or use one of the browser extensions — ReviewMeta merely strips out or reduces the weight of certain reviews, then leaves you with an adjusted rating.
In other words, instead of the letter grade, which can be misleading, ReviewMeta shows you what the Amazon average rating would be if the questionable reviews didn’t exist.
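The adjustment ReviewMeta describes — dropping or down-weighting suspect reviews and re-averaging the rest — can be sketched in a few lines of Python. The sample reviews and the 0.0/0.5/1.0 trust weights below are hypothetical, not ReviewMeta’s actual algorithm:

```python
# Illustrative sketch of recomputing an average star rating after
# discounting suspect reviews. Sample data and weights are invented.

def adjusted_rating(reviews):
    """Weighted average of (stars, weight) pairs; weight 0 removes a review."""
    total = sum(stars * weight for stars, weight in reviews)
    weight_sum = sum(weight for _, weight in reviews)
    return total / weight_sum if weight_sum else None

# (stars, trust_weight): 1.0 = trusted, 0.5 = questionable, 0.0 = discarded
sample = [(5, 1.0), (5, 0.0), (5, 0.0), (4, 1.0), (3, 1.0), (5, 0.5)]

raw = sum(stars for stars, _ in sample) / len(sample)
print(f"Raw average:      {raw:.2f}")                      # 4.50
print(f"Adjusted average: {adjusted_rating(sample):.2f}")  # 4.14
```

In this made-up example, three gushing five-star reviews get thrown out or discounted, and the product drops from 4.5 stars to a less flattering 4.1 — exactly the kind of gap an adjusted rating is meant to expose.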
Here’s where it gets interesting: Often, Fakespot and ReviewMeta reach very different conclusions about a product’s reviews.
Grading the graders
What can we make of all this? If we can’t always trust the reviews shared by Amazon customers, can we trust the reviews of those reviews?
It’s a challenge, to be sure. As Noonan told me, “It’s impossible for someone to definitively determine whether a review is ‘fake’ or ‘real.’ Not even a human can do it, so it’s impossible to really determine how ‘accurate’ Fakespot or ReviewMeta is.”
Noonan says he designed ReviewMeta with that in mind, and it’s why he shares as much detail as possible on the reports. “The tool isn’t really intended to just give you a black and white answer,” he says, “but more to show you all the data that we possibly can and then let you make your own decision.”
And I think that’s the key takeaway here: Be aware that any Amazon rating might be artificially inflated, and use tools like Fakespot and ReviewMeta if you think you’re not getting an accurate picture. At the same time, be aware that these analyses might have accuracy issues as well, and that they don’t necessarily reflect the quality of the product itself.
The Status Audio CB-1 headphones shown throughout this story are a perfect example. They have a 4.7-star average rating from over 400 Amazon customers, suggesting an exceptional product. According to both Fakespot and ReviewMeta, however, you need to eliminate at least half those reviews from the equation, because they’re questionable in some way.
Does that mean the company has engaged in shady review practices? Or that the headphones aren’t quite as exceptional as the reviews suggest? It’s tough to say, but ReviewMeta’s adjusted score tells a better story: The 207 “good” reviews average out to 4.4 stars, so you can rest assured the headphones are probably above average.
My advice: Take everything with a grain of salt. Don’t believe everything you read. Do use common sense. That’s good advice whether you’re shopping on Amazon or, you know, reading the news.
Have you had a run-in with fake reviews? Ever purchased something knowing full well the reviews were questionable? What was the outcome?