My friends were confused: "Amazon had a five-star rating!"
I pulled out my laptop and checked the product page. Sure enough: 37 five-star ratings. But this thing was undoubtedly a lemon. What the hell?
Puzzle solved: every single review was a fake.
Fake news, meet fake reviews
What is a fake review? Exactly what it sounds like: a review posted by an employee of the company, a paid individual, or some other party with an interest in selling more product. Last year, for example, skin-care brand Sunday Riley was caught instructing employees to post fake reviews on Sephora.
This is a serious problem, and one I see all the time – mostly with products sold by small or overseas companies.
One or two fakes: no big deal. Dozens of them: now you have an artificially inflated product rating. It's far too easy to look at a four- or five-star average and think, "OK, this must be good!" Few people will take the time to look for red flags in every review – or reviewer.
Here's a good example: you're in the market for a GoPro-style action camera. A real GoPro will cost you hundreds of dollars, but there are countless knockoffs selling for $40 to $50. They can't possibly be any good, right? Well, they look like GoPros. They come with lots of accessories. And here's the kicker: top ratings from dozens or even hundreds of reviewers. Sold!
The problem is, dozens or even hundreds of those reviews could be fake – or at least questionable. It's hard to know for sure, but there are telltale signs. More on that below.
But shouldn't Amazon do something about this? Several years ago, the company promised to take action against incentivized reviews – those offered in return for free or discounted products. Sure enough, I no longer see reviews with that embedded disclaimer – which doesn't mean the number of illegitimate reviews has dropped.
In my world, where I often write about lesser-known tech brands and products, not much has changed. So let's talk about the tools that can help you spot fake reviews – and, just as important, how to interpret their results.
X marks the Fakespot
Fakespot, a free website that analyzes product reviews, helps you separate the real wheat from the counterfeit chaff. Just copy and paste the link to a product page, then click Analyze.
You can also use a browser extension for Chrome, Firefox, and Safari, which makes this even easier: just click the Fakespot icon in your toolbar for an instant analysis. Fakespot is also available for Android and iOS, so you can use it on the go.
Fakespot initially focused its algorithms solely on Amazon, but later added support for TripAdvisor and Yelp. Just last week, the company added analysis for Best Buy, Sephora, Steam, and Walmart. (Incidentally, Fakespot found that just over 50 percent of Walmart reviews were "inauthentic and unreliable," while fewer than 5 percent of Best Buy reviews were.)
The system analyzes both the reviews and the reviewers, looking for questionable spelling and grammar, review counts, purchase patterns, mismatched dates, and other telltale signs of suspicious review activity. For example, a reviewer who's new to Amazon, has posted just one review, and leans heavily on words like "great" and "amazing" will almost certainly be deemed "unreliable."
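To make the idea concrete, here's a toy sketch of how signals like those might combine into a suspicion score. This is purely illustrative – Fakespot's actual algorithm is proprietary, and the signal names, thresholds, and weights below are all hypothetical:

```python
# Toy illustration only; not Fakespot's real (proprietary) algorithm.
# All signals, thresholds, and weights here are made up for demonstration.

SUPERLATIVES = {"great", "amazing", "awesome", "perfect", "best"}

def suspicion_score(review_text, reviewer_review_count, account_age_days):
    """Return a score from 0.0 to 1.0; higher means more suspicious."""
    words = review_text.lower().split()
    superlative_ratio = (
        sum(w.strip(".,!") in SUPERLATIVES for w in words) / max(len(words), 1)
    )
    score = 0.0
    if reviewer_review_count <= 1:       # brand-new reviewer
        score += 0.4
    if account_age_days < 30:            # very young account
        score += 0.3
    score += min(superlative_ratio * 2, 0.3)  # superlative-heavy wording
    return min(score, 1.0)

# A new account gushing in superlatives scores high...
print(suspicion_score("Great amazing perfect best earphones!", 1, 5))
# ...while an established reviewer writing specifics scores low.
print(suspicion_score("Battery lasts six hours; the clip broke.", 48, 900))
```

No single signal proves anything – it's the combination of a brand-new account, a lone review, and generic gushing that moves the score.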
When the analysis is complete, Fakespot assigns a letter grade based on the total number of reviews and the number of unreliable ones. And here's where confusion can creep in: if you look up one of the aforementioned cameras and it scores an "F" because, say, 57 percent of its reviews were deemed unreliable, you might be less inclined to buy it.
Ah, but does that mean the product itself is bad? Not necessarily. More on that in the next section.
Next up: ReviewMeta, an Amazon-only review analyzer that takes a very different approach, according to developer Tommy Noonan. Although it's functionally similar – paste in an Amazon link or use one of the browser extensions – ReviewMeta either discards questionable reviews or reduces their weight, then gives you an adjusted rating.
In other words, instead of a letter grade, which can be misleading, ReviewMeta shows you what the average Amazon rating would look like if the questionable reviews weren't there.
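The simplest version of that approach – drop the flagged reviews and recompute the mean of what's left – can be sketched in a few lines. The ratings and flags below are made up for illustration; ReviewMeta's actual weighting is more nuanced:

```python
# Illustrative sketch of an "adjusted rating"; made-up data, not
# ReviewMeta's actual (more nuanced) weighting method.

def adjusted_rating(ratings, flagged):
    """Average only the ratings not flagged as questionable.

    ratings: list of star values (1-5)
    flagged: parallel list of booleans, True = questionable
    """
    kept = [r for r, bad in zip(ratings, flagged) if not bad]
    if not kept:
        return None  # nothing trustworthy left to average
    return round(sum(kept) / len(kept), 1)

# Hypothetical product: ten reviews, the four 5-star ones flagged.
ratings = [5, 5, 5, 5, 4, 4, 3, 5, 5, 2]
flagged = [True, True, True, True, False, False, False, False, False, False]

print(sum(ratings) / len(ratings))        # raw average: 4.3
print(adjusted_rating(ratings, flagged))  # adjusted average: 3.8
```

The point is easy to see in the numbers: a cluster of suspect five-star reviews can drag a mediocre product up to a "buy it" average.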
Here's where it gets interesting: Fakespot and ReviewMeta often reach very different conclusions about a product's reviews. I've seen it happen: one tool gives the reviews a passing grade while the other deems them mostly counterfeit. If we can't always trust the reviews shared by Amazon customers, can we trust the ratings of those reviews?
It's a challenge, to be sure. As Noonan told me, "It's impossible for anyone to definitively determine whether a review is 'fake' or 'real.' Not even a human can do that, so it's impossible to really determine exactly how accurate Fakespot or ReviewMeta is."
Noonan says he designed ReviewMeta with that idea in mind, which is why its reports are as detailed as possible. "The tool isn't really meant to give you just a black-and-white answer," he says. "It shows you all the data we can, and then lets you make your own decision."
And I think that's the key takeaway: be aware that a product's rating might be artificially inflated, and turn to tools like Fakespot and ReviewMeta when you suspect you're not getting an accurate picture. Keep in mind, though, that these analyses can have accuracy issues of their own – and that they don't necessarily reflect the quality of the product itself.
The Atech earphones featured in this story are a perfect example. They have an average rating of 4.3 stars from 16 Amazon customers, which suggests a solid product. According to Fakespot, however, only about 62 percent of those reviews are reliable. ReviewMeta puts the number at just 50 percent, which leads to a lower adjusted rating for the earphones: 3.9 stars.
My advice: take everything with a grain of salt. Don't believe everything you read. Use common sense. That's good advice whether you're shopping at Amazon or anywhere else.
Have you ever run afoul of fake reviews? Ever bought something even though you suspected the reviews were questionable? How did it turn out?
Originally posted on February 20, 2017.
Updated March 4, 2019, with current information.