True story: Some friends of mine recently bought a GPS locator for their daughter and had trouble getting it to work. They brought it to me – I'm the geek squad for my friends and family.
My friends were confused: "It had a five-star Amazon rating!"
I pulled out my laptop and checked the product page. Sure enough: 37 five-star ratings. But this thing was undeniably a lemon. What the hell?
Puzzle solved: every single review was a fake.
Fake news, fake reviews
What is a fake review? Exactly what it sounds like: a review posted by an employee of the company – or by someone else with an interest in selling more product.
This is a serious problem, and in my line of work I see it all the time – mostly with products sold by small or foreign companies.
One or two fakes: no big deal. Many of them: now you have an artificially inflated product rating. It's far too easy to look at a four- or five-star average and think, "OK, this must be good!" Few people take the time to dig into individual reviews – or reviewers – looking for red flags.
Here's a great example: you're in the market for a GoPro-style action camera. A true GoPro will cost you $200 to $400 in the US, but there are countless knockoffs for as little as $40 to $50. They can't possibly be that good, right? Well, they look like GoPros. They come with lots of accessories. And here's the kicker: good scores from dozens or even hundreds of reviewers. Sold!
The problem is that dozens or even hundreds of those reviews could be fake – or at least questionable. It's hard to know for sure, but there are telltale signs. More on that below.
But shouldn't Amazon do something about this? About a year ago, the company banned incentivized reviews – those written in exchange for free or discounted products. I have indeed seen fewer reviews with the embedded disclaimer since then – but that doesn't mean the number of illegitimate reviews has decreased.
Indeed, in my world, where I often write about lesser-known tech brands and products, not much has changed. So let's talk about the tools you can use to detect fake reviews and, just as importantly, how to interpret their results.
X Marks the Fakespot
First up is Fakespot, a free website that analyzes Amazon product reviews to help you separate the wheat from the, well, fake. Just copy the URL of the product page, paste it into Fakespot, and click Analyze.
The service also offers browser extensions for Chrome, Firefox, and Safari that make this even easier: just click the Fakespot icon in your toolbar for an instant analysis. It's available for Android and iOS as well, so you can use Fakespot on the go.
Fakespot analyzes both reviews and reviewers, looking for questionable spelling and grammar, review counts, purchasing patterns, mismatched dates, and other telltale signs of fakery. For example, a reviewer who's brand new to Amazon just posted a review stuffed with words like "great" and "amazing"? That review will almost certainly be flagged as unreliable.
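To make the idea concrete, here's a toy sketch of the kind of heuristics described above. The word list, thresholds, and `Review` fields are my own illustrative assumptions – this is not Fakespot's actual algorithm, just a minimal example of flagging a brand-new account that leans heavily on superlatives.

```python
from dataclasses import dataclass

# Hypothetical list of "gushing" words; a real system would use a much richer model.
SUPERLATIVES = {"great", "amazing", "awesome", "perfect", "best"}

@dataclass
class Review:
    reviewer_review_count: int  # how many reviews this account has ever posted
    text: str

def is_suspicious(review: Review) -> bool:
    """Flag reviews from brand-new accounts whose text is mostly superlatives."""
    words = [w.strip(".,!?").lower() for w in review.text.split()]
    if not words:
        return False
    superlative_ratio = sum(w in SUPERLATIVES for w in words) / len(words)
    # New reviewer + gushing language is a classic red flag.
    return review.reviewer_review_count <= 1 and superlative_ratio > 0.15

print(is_suspicious(Review(1, "Amazing product, great quality, best purchase ever!")))
print(is_suspicious(Review(50, "Works fine for my needs.")))
```

A real detector would combine many more signals (purchase verification, review-date clustering, reviewer overlap across products), but the shape is the same: score each review and flag outliers.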
Once the analysis is complete, Fakespot assigns a letter grade based on the total number of reviews and how many of them were deemed unreliable. And that's where things can get a bit confusing: if you look up one of those knockoff cameras and it gets an "F" because 57 percent of its reviews were flagged as unreliable, you might be much less inclined to buy it.
Ah, but does that mean the product is bad? Not necessarily. More in the next section.
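A grade like that is just a function of the unreliable-review percentage. The cutoffs below are invented for illustration – Fakespot doesn't publish its exact thresholds – but they show how a 57-percent-unreliable product ends up with an "F":

```python
def letter_grade(total_reviews: int, unreliable: int) -> str:
    """Map the share of unreliable reviews to a letter grade.
    Thresholds are hypothetical, purely for illustration."""
    pct = 100 * unreliable / total_reviews
    if pct < 10:
        return "A"
    if pct < 20:
        return "B"
    if pct < 30:
        return "C"
    if pct < 45:
        return "D"
    return "F"

print(letter_grade(100, 57))  # the knockoff-camera case: 57% unreliable -> "F"
```

Note what the grade does *not* tell you: it measures the trustworthiness of the reviews, not the quality of the product.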
Next up is ReviewMeta, which, according to developer Tommy Noonan, takes a completely different approach. Although it's functionally similar – paste in an Amazon link or use one of the browser extensions – ReviewMeta removes or reduces the weight of questionable reviews and then presents you with an adjusted rating.
In other words, instead of a letter grade, which can be misleading, ReviewMeta shows you what the product's Amazon average would be if the questionable reviews didn't exist.
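That adjustment is easy to picture: drop (or down-weight) the flagged reviews and recompute the star average. This sketch is my own simplification of the idea, not ReviewMeta's actual methodology:

```python
def adjusted_average(ratings, flagged):
    """Recompute the star average after dropping reviews flagged as questionable.
    `ratings` is a list of star values; `flagged` is a parallel list of booleans."""
    kept = [r for r, bad in zip(ratings, flagged) if not bad]
    if not kept:
        return None  # every review was flagged; no trustworthy signal left
    return round(sum(kept) / len(kept), 1)

# Ten 5-star reviews (six of them flagged as fake) plus four genuine 3-star reviews:
ratings = [5] * 10 + [3] * 4
flagged = [True] * 6 + [False] * 8
print(adjusted_average(ratings, flagged))
```

In this made-up example the raw average is about 4.4 stars, but the adjusted average drops to 4.0 once the suspect five-star reviews are excluded – exactly the kind of gap these tools are meant to surface.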
Here's where it gets interesting: Fakespot and ReviewMeta often reach very different conclusions about a product's reviews, as the examples above show.
Grading the graders
So what do we do with all this? If we can't always trust Amazon customer reviews, can we trust the sites that rate those reviews?
It's a challenge, to be sure. As Noonan told me, "It's impossible for someone to definitively determine whether a review is 'fake' or 'genuine.' Not even a human can do that, so it's impossible to really determine exactly how accurate Fakespot or ReviewMeta is."
Noonan says he designed ReviewMeta with that limitation in mind, which is why its reports offer as much detail as possible. "The tool is not really meant to give you just a black-and-white answer," he says, "but more to show you all the data we possibly have, and then you can make your own decision."
And I think that's the key takeaway here: be aware that any Amazon rating could be artificially inflated, and use tools like Fakespot and ReviewMeta when you suspect you're not getting an accurate picture. At the same time, recognize that these analyses have accuracy issues of their own and may not necessarily reflect the quality of the product itself.
The Vantrue T2 dash cam featured in this story is a perfect example. It has an average rating of 4.6 stars from over 400 Amazon customers, suggesting a rather extraordinary product. Yet according to Fakespot, nearly 90 percent (!) of those reviews should be thrown out because they're questionable in some way.
Does that mean the company has engaged in dubious review practices? Or that the dash cam isn't quite as extraordinary as the reviews suggest? It's hard to say, especially considering that ReviewMeta deems most of the reviews "natural." According to its analysis, around 85 of 105 reviews are real. The adjusted score tells a better story: the 207 "good" ratings average 4.4 stars, so you can be reasonably confident the dash cam is above average.
My advice: take it all with a grain of salt. Don't believe everything you read. Use common sense. That's good advice whether you're shopping at Amazon or anywhere else.
Have you ever had to deal with fake reviews? Have you ever bought something knowing full well that its ratings were questionable? How did it turn out?
Originally posted on February 20, 2017.
Update, September 27, 2018: New information added.