Does Twitter Conversation Have an Effect on Tesla’s Stock Price?
By Lena Höck, Nov 16th
Published October 2nd 2018
Millennials love reviews.
Apparently, eight out of ten of them won’t buy anything without first reading a review – a finding that seems overstated, but it sits within a large body of studies that share the same sentiment. Millennials, distrustful of ads, prefer to read about products and services from the perspective of their peers.
It makes sense. I’ll rarely purchase anything from Amazon without checking what others have said about the quality of the product, shipping times and other things people have raved or complained about – often reviews can answer the important questions that product descriptions just don’t.
We may all find reviews helpful, but reviews are not perfect. In many cases, reviews can be gamed or falsified in either the interest of the greedy seller or the troll-y consumer. Representation is also important – not all product or service users will be willing or able to provide online reviews.
We’ve already discussed disastrous events, dubious e-commerce practice, phoney influencers and catfish in our ‘Fake it’ series, but this time it’s the shady side of reviewing that we wanted to explore.
In what is either considered a legendary prank or a total waste of time, last year Vice’s Oobah Butler embarked on a new project – to turn a shed into the most highly rated restaurant in London.
Using his network of friends to create reviews, as well as some pretentious photography to show off the cuisine (in one instance, an egg is balanced on a foot), ‘The Shed at Dulwich’ began to climb up the TripAdvisor rankings.
Eventually, without ever opening its doors, the shed had become the highest rated restaurant in London, according to the review site. The crescendo of the prank was when Butler opened the restaurant for one night only and served meals from the budget supermarket Iceland to visitors. Some of them tried to book again.
Butler was inspired to try this out by his early career. He writes:
“Restaurant owners would pay me £10 and I’d write a positive review of their place, despite never eating there. Over time, I became obsessed with monitoring the ratings of these businesses. Their fortunes would genuinely turn, and I was the catalyst.”
TripAdvisor pretty much dismissed it as a prank – after all, who would benefit from creating a fake restaurant? But Butler’s project is a great example of how easy it is to skew reviews when you’ve got enough budget or willing accomplices.
Let’s follow a new tangent and consider the narcotics industry. After all, the digital forces that allow you to order groceries to your home at a particular time of day, or get a ride across the city with a few taps on your smartphone, are the same ones that are rapidly changing the international drugs trade.
In ‘Narconomics: How to Run a Drug Cartel’, Tom Wainwright writes:
“Until now, getting hold of drugs has been difficult and often unpleasant, requiring a network of dodgy contacts or a nerve-racking trip down a dark alleyway. Buying online makes it easy and lends an almost respectable face to a grubby business…If you are capable of buying a book on Amazon, you are probably up to buying crystal meth on the Dark Web.”
There’s plenty of room for fakery.
Vendors can stand out with simple user interfaces, helpful customer service and competitive delivery times. But while digital transformation may have put a friendlier face on the consumer side of the narcotics industry, in many cases it’s not done much to clean up the violence and exploitation at earlier parts of the supply chain. Wainwright discusses instances of cocaine sold as “Fair Trade” and “Conflict Free” which he describes as “a boast that is patently untrue, given that the world’s cocaine supply is controlled by a group of murderous cartels.”
Despite the wild claims and general dubiousness of buying illegal substances from strangers, there are some safeguards in place. Review systems make it easier for consumers to discern the quality of the product and the reliability of the seller, meaning that the shoddiest players are less likely to do damage.
Of course, as Wainwright notes, you won’t find reviews from the dead. For a review system to have even a small amount of reliability, you’d like to think there’s representation from those who are both happy and unhappy with a product’s effects, regardless of the industry.
Maybe a perfect review system is a bit too high a goal to shoot for, but there are certainly some things that can be done to improve trust in online reviews.
At the very least, some form of verification around whether the review author has actually tried the product or service in question would make total sense. But that validation isn’t always easy.
G2 Crowd is a great example of a review system that champions all user opinions while also making sure customer reviews come from real customers. Their community guidelines say: “After our automatic filtering process removes reviews that do not meet our minimum submission requirements, our team manually checks each review. All reviews must pass our moderation process before they are published. The “Validated Reviewer” label denotes how the reviewer was authenticated. The “Verified Current User” label indicates that the review includes an approved screenshot of the reviewer logged into the software.”
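To make the two-stage process above concrete, here is a toy sketch in Python of an automatic filter followed by trust labelling. Everything in it – the field names, the minimum-length rule, the label strings reused from the quote – is a hypothetical illustration, not G2 Crowd’s actual system.

```python
# Toy sketch of a two-stage review check, loosely modelled on the
# process described above: automatic filtering first, then labels
# based on how the reviewer was authenticated. All names and
# thresholds are invented for illustration.

from dataclasses import dataclass


@dataclass
class Review:
    text: str
    identity_verified: bool      # reviewer's identity was authenticated
    has_usage_screenshot: bool   # approved screenshot of reviewer in the software


MIN_LENGTH = 20  # hypothetical minimum-submission requirement


def label(review: Review) -> str:
    """Apply the automatic filter, then assign a trust label."""
    if len(review.text.strip()) < MIN_LENGTH:
        return "rejected"                  # fails automatic filtering
    if review.has_usage_screenshot:
        return "Verified Current User"     # strongest signal: proven usage
    if review.identity_verified:
        return "Validated Reviewer"        # identity known, usage unproven
    return "pending manual moderation"     # falls to the human team


print(label(Review("Great tool, solid support team.", True, True)))
# prints "Verified Current User"
```

The point of the sketch is the ordering: cheap automatic checks run first, and only reviews that survive them consume human moderators’ time – which is exactly the cost problem discussed next.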
The problem is, having a moderation team and a verification process is not easily practicable for companies like Amazon or TripAdvisor, which must process an eye-watering volume of reviews every day. Whether automation can help here is up for debate – without human moderators, it feels like review systems can easily be gamed.
Reviewing people is a meaty issue in 2018.
Shows like Black Mirror dramatise the scary side of this, but in reality we’re already reviewing our Uber drivers and they’re reviewing us. Meanwhile, in China an admittedly terrifying-sounding mandatory social credit system is being trialled. Business Insider describes it here:
“Like private credit scores, a person’s social score can move up and down depending on their behaviour. The exact methodology is a secret — but examples of infractions include bad driving, smoking in non-smoking zones, buying too many video games and posting fake news online.”
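The “moves up and down depending on their behaviour” mechanic can be sketched in a few lines of Python. The infractions and point values below are invented for illustration – as the quote notes, the real methodology is secret.

```python
# A minimal sketch of a score that moves up and down with behaviour,
# as described by Business Insider. Event names and point values are
# hypothetical; the actual methodology is not public.

PENALTIES = {
    "bad_driving": 5,
    "smoking_in_nonsmoking_zone": 10,
    "posting_fake_news": 50,
}

REWARDS = {
    "charity_donation": 10,
}


def update_score(score: int, events: list[str]) -> int:
    """Subtract penalties and add rewards for each recorded event."""
    for event in events:
        score -= PENALTIES.get(event, 0)
        score += REWARDS.get(event, 0)
    return max(score, 0)  # clamp at zero


print(update_score(100, ["bad_driving", "posting_fake_news", "charity_donation"]))
# prints 55  (100 - 5 - 50 + 10)
```

Even this toy version makes the design worry visible: whoever fills in the penalty table decides whose score falls, and the person being scored never sees the table.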
Given that we can’t currently find a foolproof way of reviewing products and restaurants, reviewing people seems like a step too far. Adding a personal element opens the door to all kinds of malice and inauthenticity – at least it feels that way, compared to the more impersonal things we might say about a speaker with a faulty USB connector.
Labelling theory, developed by Howard Becker, describes the ways in which the terms used to ‘label’ people (like “criminal” or “badly behaved”) can affect the way those individuals behave – essentially, labels become self-fulfilling prophecies. Attaching a numerical score to someone’s character or behaviour, especially a poor one, doesn’t feel like a sensible way to make things better for anyone.
Perhaps we’re taking our love of reviews too far.
Allow us to drag you through the muddy waters of influencer marketing dark arts in the third edition of our Fake it 'Til You Make it series.