Two reviews on the TripAdvisor listing for the same hotel. The first: “A hidden gem that you must try.” The second: “Hidden gem in Mykonos.” Something smells funny. Both reviews are vague on details, and there is something strange about them. There are plenty of things you could call a decent Marriott, but “hidden gem” is not one of them. There is nothing hidden about a hotel that belongs to the largest hotel group in the world. Is it a coincidence that two nearly identical reviews use the same language?
No: both reviews are fake. And there are thousands more like them on the platform.
Last year, the platform concluded that almost a million reviews submitted to TripAdvisor, equivalent to 3.6% of the total, were fake. In a second report, the trip-planning website noted that 67.1% of fake reviews had been caught by its moderation algorithm before they ever appeared on the site. In total, TripAdvisor has sanctioned 34,605 properties for fraudulent activity in less than a year and banned 20,299 members.
The report also brings to light the reality of paid reviews, a phenomenon driven by companies trying to climb the rankings with positive reviews and reap the benefits that follow. The company says it removed paid reviews targeting businesses in 131 countries last year, including “an increase” in such reviews originating from India, though not necessarily written on behalf of companies in that country.
India tops the list of countries that most often produce paid reviews, and Russia also appears in the ranking. In addition, the company has identified more than 372 websites dedicated to selling reviews online.
While fake reviews pose the greatest threat to the integrity of the platform and of the businesses that depend on it, they aren’t the only ones being removed. In total, more than two million submitted reviews (8.6% of the total) were rejected or removed from TripAdvisor for other reasons, for example because they contained insults.
Everywhere, but easy to spot
As you might expect, one of the main challenges customers face on these platforms is judging whether a review is legitimate. While TripAdvisor goes to great lengths to weed out fakes, there’s not much it can do when it can’t verify that someone actually stayed at a hotel. But fake reviews are generally quite easy to spot. Some of the common signs include:
- The person leaving a review is new to the community and has not left any other reviews.
- There are a ton of new reviews at once, mostly from accounts with limited history.
- The reviews are short and do not contain much detail.
- They use the same kind of language and grammar, perhaps even regurgitating the hotel’s own marketing copy.
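The per-review signs above can be sketched as a simple scoring heuristic. This is a minimal illustration, not TripAdvisor’s actual system: all names, phrases, and thresholds are assumptions, and the “burst of new reviews” signal is omitted because it applies to a listing, not a single review.

```python
# Hypothetical sketch of scoring one review against the warning signs above.
# Everything here (phrase list, word-count cutoff) is an illustrative assumption.
from dataclasses import dataclass

@dataclass
class Review:
    text: str
    author_review_count: int  # how many other reviews the author has posted

# Marketing-style stock phrases that often appear verbatim in fake reviews.
MARKETING_PHRASES = {"hidden gem", "must try", "best kept secret"}

def suspicion_score(review: Review) -> int:
    """Count how many red flags a single review triggers (0-3)."""
    score = 0
    if review.author_review_count == 0:
        score += 1  # brand-new account with no other reviews
    if len(review.text.split()) < 15:
        score += 1  # short review with little detail
    lowered = review.text.lower()
    if any(phrase in lowered for phrase in MARKETING_PHRASES):
        score += 1  # recycled marketing-style language
    return score

# The vague review from the opening paragraph trips all three flags:
fake = Review("A hidden gem that you must try.", author_review_count=0)
print(suspicion_score(fake))  # 3
```

In a real moderation pipeline a score like this would only flag reviews for human or model-based inspection, not remove them outright.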
In 2019, TripAdvisor was criticized by the consumer group Which? over what it called “suspicious” review patterns from employees, and the group accused the company of failing to stop the trend or to crack down on hotels that abuse the system. James Kay, TripAdvisor’s UK director, countered that the website hunts for such reviews relentlessly. “We do it more than any other platform,” he noted in a BBC interview. Shortly afterwards, TripAdvisor removed 730 five-star reviews of a well-known Jordanian hotel.
In Italy, TripAdvisor cooperated in a prosecution that sent a seller of fake positive reviews to jail. Under an EU directive in force in the UK since 2008, hotel staff who post favorable reviews of their own property on travel websites such as TripAdvisor are committing an offence. Any company that breaks the rules can face prosecution, heavy fines, and possibly even jail time for its staff.
But the problem does not only affect TripAdvisor. In Spain, the consumer organization OCU analyzed, between June and August, more than 6.3 million reviews of some 47,000 products sold on Amazon Italy, Spain, and France, as well as hotels on TripAdvisor and Booking, and the findings were telling. On Amazon, 9.38% of products had ratings significantly skewed by unnatural reviews (13.4% in the case of smartwatches), compared with 6.20% of TripAdvisor hotels and 2.10% of Booking hotels.
The report yielded other data of interest, such as the identification of 75 Amazon sellers, mainly Chinese companies, who openly offer their products for free in exchange for positive reviews. Just search Facebook or Telegram for the term “Amazon reviews” to glimpse the underworld behind this phenomenon.