We have more choice today than ever before when it comes to buying both the goods we need and the goods we want. When we have about a jillion paper towel holders competing for our attention, it can be difficult to figure out which one will work best for us. Ecommerce has made this even more difficult, as we’re left trusting the pictures put up by the manufacturer to help us decide what to buy since we can’t hold the item in our own hands before purchasing. To counter this, we now have product reviews that help us look at the good and the bad of what we’re about to add to our cart, straight from people who have already bought the product.
Unfortunately, these reviews aren’t always what they seem. Often the volunteer consumers who take a moment to share their thoughts aren’t volunteers at all; they’re paid shills.
CROWDING THE CRITICS
Professional critics tend to attract a lot of cynicism. Part of this is because we, as humans, have difficulty understanding why someone wouldn’t like something that we love. With the rise of the user-generated content of Web 2.0, a lot of power has been handed over to the consumer. Now, almost every site that sells a product lets its users submit their own opinions on it. Everything from the food you order for delivery to your family doctor has a publicly chosen rating available online.
However, the public is a fickle group that will one day cheer for your quality, affordable product, only to turn on you the moment your cellphone explodes in their pocket. This is usually accompanied by a slew of 1-star reviews that can utterly sink what was intended to be the newest, hottest thing.
Think about it: if you’re checking out a product page and you see an abysmal user rating, chances are you’re going to check out your other options. A negative rating is poison to a company. To turn this around, companies as large as Microsoft and as small as the family restaurant down the block will often try to crowdsource positive reviews. Sometimes this is merely dishonest, like when they hold contests that ask people to submit 5-star reviews to be entered into a draw; other times it’s straight up illegal, like when they outright pay people to send those 5-stars their way.
HAND IN THE COOKIE JAR
This form of astroturfing can actually be a pretty dangerous game for all involved. Back in 2014, Bell was caught and fined $1.25 million for encouraging their own employees to leave positive reviews of their app. Some of the company’s higher-ups even got in on what they probably assumed was simple corporate cheerleading. A sting operation in New York pulled in 19 companies offering to provide favourable reviews for a fee, resulting in fines totaling $350,000. A business putting up its own favourable reviews, whether by having its employees do it or by outsourcing to another firm, violates consumer protection laws against deceptive advertising.
And yet, simply publishing an application on a mobile app store typically results in a deluge of offers from such firms willing to raise the profile of the software by inundating it with 5-star reviews. Writers with a deficiency in moral fiber advertise their services openly on freelance sites. It’s all perpetuated by the old fallacy of, “Everyone else does it; if I want to compete, I have to do it too!”
LYING FOR THE MAN
Do you have few scruples and a mighty need for some extra pocket change? Good news: a career in ruining the internet for everyone might just be for you! A quick browse through Craigslist or Kijiji turns up countless opportunities to write phony words of praise, offering anywhere from $5 to $50 for you to tell other people on Yelp how great the food is at a restaurant you’ve never eaten at.
In all seriousness, astroturfing through fake reviews is a tough problem that likely won’t go away anytime soon. It’s often estimated that around one-third of all consumer reviews are fake. Corporations like Google and Yelp that host these reviews do their best to combat falsified praise, but it’s a game of deception, which means the most professional operators have methods to remain undetected. Algorithms have been created that can detect fake reviews with an accuracy of around 90%, but as these things go, if you create a better lie-detector, someone will create a better politician.
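To give a sense of how such detectors work under the hood: one common family of approaches treats the problem as text classification, flagging reviews whose word patterns resemble known fakes (for example, stuffed superlatives and little concrete detail). The sketch below is a toy illustration only, not any platform’s actual method; real systems are trained on large labelled corpora and also use reviewer history, timing, and network signals. All the training examples here are invented for demonstration.

```python
import math
from collections import Counter

def tokenize(text):
    # Simplistic tokenizer for the sketch; real systems normalize far more.
    return text.lower().split()

def train(samples):
    """samples: list of (text, label) pairs; returns word counts per label
    and how many training documents carried each label."""
    counts = {"fake": Counter(), "real": Counter()}
    doc_totals = Counter()
    for text, label in samples:
        for word in tokenize(text):
            counts[label][word] += 1
        doc_totals[label] += 1
    return counts, doc_totals

def classify(text, counts, doc_totals):
    """Naive Bayes: pick the label maximizing log P(label) + sum log P(word|label)."""
    vocab = set(counts["fake"]) | set(counts["real"])
    scores = {}
    for label in ("fake", "real"):
        score = math.log(doc_totals[label] / sum(doc_totals.values()))
        n_words = sum(counts[label].values())
        for word in tokenize(text):
            # Laplace smoothing so unseen words don't zero out a label.
            score += math.log((counts[label][word] + 1) / (n_words + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

# Invented toy training data: hyperbole-heavy fakes vs. mundane real reviews.
training = [
    ("best product ever amazing five stars", "fake"),
    ("amazing amazing must buy best best", "fake"),
    ("decent quality but shipping was slow", "real"),
    ("works as described had one small issue", "real"),
]
counts, doc_totals = train(training)
print(classify("amazing best product five stars", counts, doc_totals))  # → fake
```

On such a tiny vocabulary the classifier just keys on superlatives, which also hints at why the arms race continues: a paid reviewer who writes in a mundane, detail-rich style slips right past word-frequency features, pushing detectors toward behavioural signals instead.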