Published January 20th, 2016
This week the British Polling Council released their long-awaited report on exactly what went wrong around the 2015 UK General Election. Not the result, but why none of the polls saw it coming.
All were, unusually, consistent in their predictions that the most likely outcome would be a hung parliament. No major poll predicted a Conservative majority. This consensus brought into question not just the merit of any one polling body, but the validity of the polling industry as a whole.
To be fair to them, the Conservative victory was quite a subtle shift.
The Tory share of the popular vote had shifted by less than one percentage point – from 36.1% to 36.9% – since 2010. It was only when this change was distributed via the first-past-the-post electoral system that it translated into a majority – 331 of the 650 contested seats. This was hard to measure.
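To see quite how subtle, here’s a toy sketch of the mechanics. The numbers below are invented, not real constituency data, but they show how first-past-the-post can convert a sub-point national swing into a pile of seats: in a marginal constituency, a tiny shift flips 100% of the representation.

```python
# A minimal sketch (illustrative numbers, not real constituency data) of how
# first-past-the-post can turn a sub-one-point swing into many seats.
# Each constituency is a dict of party -> vote share; the winner takes the seat.

def seats_won(constituencies):
    """Count seats per party: whoever has the largest share wins the seat."""
    tally = {}
    for shares in constituencies:
        winner = max(shares, key=shares.get)
        tally[winner] = tally.get(winner, 0) + 1
    return tally

# Ten toy marginal seats where the Conservatives trail by half a point.
marginals = [{"Con": 34.8, "Lab": 35.3, "Other": 29.9} for _ in range(10)]

print(seats_won(marginals))  # {'Lab': 10}

# Apply a uniform +0.8 point swing to the Conservatives (as in 2010 -> 2015).
swung = [{**s, "Con": s["Con"] + 0.8} for s in marginals]

print(seats_won(swung))  # {'Con': 10} - every marginal flips
```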
While the electoral maths is sound, the surprise of such a result can still be unsettling, not least to the 63.1% who voted otherwise and might continue to question their government’s mandate post-victory.
Which is why a report on polling failures is so welcome to both government and voters: it assures them that it was the polls that were flawed, not the election. That reassurance is essential for trust in our leaders.
But the group for whom this report matters the most is the pollsters themselves, whose very raison d’être was undermined the moment the BBC exit poll (the first poll that, in retrospect, gave any indication of the result) pulled the rug from beneath them.
As social media analysts, we can watch the circus from afar, smiling wryly in the comfort of our own glass houses. Brandwatch’s Twitter analysis around the time was even more misleading than the polls, suggesting an even greater lean to the left than the pollsters foresaw.
We were, though, cautious with our caveats that this chatter wouldn’t necessarily correlate with voting intention.
We know well the biases and demographic skew of the data we deal with, so we didn’t need to await a report to be aware of the problems highlighted today; they were already familiar to us.
The polls, the BPC report concludes, were wrong because:
1) They under-represented older voters, who are less likely to engage with online polls.
2) They over-represented politically active younger people, who may tend to be more left-leaning.
3) They heard more from the time-rich than the time-poor – i.e. those more likely to answer the door to a pollster, rather than those with, say, a job.
“In the face-to-face British Social Attitudes survey, Labour was six points ahead among respondents who answered the door at the first visit, whereas the Tories enjoyed an 11-point advantage among interviewees that required between three and six home visits.”
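Those figures also hint at the remedy: if you know how hard each kind of respondent is to reach, you can reweight accordingly. The sketch below is illustrative, with invented group shares (only the six-point and eleven-point leads echo the quote above), but it shows how far a naive reading can drift from a population-weighted one.

```python
# A minimal sketch (invented shares) of how ignoring hard-to-reach respondents
# skews a poll. Respondents are grouped by how many visits it took to contact
# them; easy-to-reach groups lean Labour, hard-to-reach groups lean Tory.

# (visits_needed, share_of_population, labour_lead_in_points)
groups = [
    (1, 0.50, +6),   # answered on the first visit: Labour six points ahead
    (2, 0.30, -2),   # somewhat harder to reach (invented figure)
    (3, 0.20, -11),  # three or more visits: Tories eleven points ahead
]

# A naive poll that, in effect, only hears from first-visit respondents:
naive_lead = groups[0][2]

# A poll weighted by each group's true share of the population:
weighted_lead = sum(share * lead for _, share, lead in groups)

print(f"naive Labour lead:    {naive_lead:+.1f} points")    # +6.0
print(f"weighted Labour lead: {weighted_lead:+.1f} points")  # +0.2
```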
Social analysis has similar problems.
The demographic of social media is, on average, younger than that of the UK as a whole.
It is also skewed by the louder voices, such as politically active youth, those most grateful for a platform where their voice may matter. And those with time to engage, those without better things to do, are obviously more visible than those whose talents are occupied by concerns other than the composition of witty 140-character epithets.
These are all givens.
Social analysis has a few extra distortions of its own too.
We are unable to extract the opinion of lurkers, the 76% of Twitter users who generate no content. And among the amplified voices we can measure, presentation bias is a problem – the way users distort their own online image. We know social media users are more likely to voice socially conscious, groovy opinions than to express their more unfashionable leanings.
These biases don’t apply within the privacy of a voting booth, and are only a minor factor in traditional polling, which proactively seeks out true opinions.
Every participant in a poll or election is invited to share their thoughts. They are motivated not by a vaguely gamified urge to be part of a network, but by a targeted method of opinion extraction. By contrast, social analysis of political opinion is, at best, highly tangential.
Which is why traditional polling should, in theory, give a more representative view than social analysis. When a social media based prediction comes even slightly close, as with our spookily accurate prediction of the 2014 EU Election, it is worthy of remark.
But there is one significant advantage social analytics has over traditional polling – sample size.
With a traditional poll the sample size will always be smaller, much smaller, than the population being measured. But social data is big. Very big.
A General Election is a sampling of the populace in the same way an opinion poll is. A vote with a good turnout (66% in the election in question) can claim to be a good sample. Good enough to decide who runs the country.
An opinion poll, on the other hand, can never hope to reach even 1% of the populace, and even that would be unprecedented (1% in the UK would mean getting a meaningful response from 640,000 people). Typically pollsters will measure the responses of one or two thousand participants. The art of good polling, then, is to make sure these tiny samples are representative of the group measured. This is what the BPC report recommended: that “there needs to be a shift in emphasis away from quantity and towards quality”.
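The reason such tiny samples can work at all is that, for a genuinely random sample, the margin of error depends on the sample size rather than on the fraction of the population reached. A minimal sketch of the standard formula:

```python
# Why pollsters can live with tiny samples: for a truly random sample, the
# margin of error shrinks with the square root of n, not with the fraction
# of the population reached.

import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error (in percentage points) for a proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n) * 100

for n in (1_000, 2_000, 640_000):  # typical polls vs a hypothetical 1% of the UK
    print(f"n = {n:>7,}: ±{margin_of_error(n):.2f} points")

# n =   1,000: ±3.10 points
# n =   2,000: ±2.19 points
# n = 640,000: ±0.12 points
```

The catch, as the BPC found, is that this arithmetic only holds for a genuinely random sample; a badly drawn sample of 640,000 is no more trustworthy than a badly drawn sample of 1,000.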
With social media analysis we have the luxury of not having to choose between the two.
One 2015 report claimed 59% of UK citizens are on social, based on the calculation that there are 38 million “active social media accounts” within a population of 64 million. Even if these accounts don’t all represent unique individuals (and they probably don’t), this figure of 59% is interesting because it just so happens to match the percentage who voted in the 2001 UK General Election.
It suggests, even if the figure is wrong, that we’re heading towards a point where the sample size might equal or exceed the size of the group being measured.
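In sampling-theory terms, that is the point where the finite population correction starts to matter: once a sample covers a meaningful fraction of the population, the error of a properly random sample shrinks below what the usual square-root rule suggests. A minimal sketch using the 38 million and 64 million figures above:

```python
# A minimal sketch (illustrative figures) of the finite population correction:
# as the sample size n approaches the population size N, the sampling error
# of a simple random sample shrinks towards zero.

import math

def fpc(n, N):
    """Finite population correction factor for a sample of n drawn from N."""
    return math.sqrt((N - n) / (N - 1))

N = 64_000_000  # approximate UK population
for n in (2_000, 640_000, 38_000_000):
    print(f"n = {n:>10,}: error scaled by {fpc(n, N):.3f}")

# n =      2,000: error scaled by 1.000
# n =    640,000: error scaled by 0.995
# n = 38,000,000: error scaled by 0.637
```

Again, this only helps if the sample is unbiased; sheer size can’t launder skew.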
While I might question a claim that 59% of the UK are active over-sharers, I’m more accepting of the surveys that suggest 80-90% of 16-24-year-olds regularly use social media.
If these habits were to continue into later life, combined with the ongoing decline in election turnouts (it’s now difficult to believe 83.9% turned out for the 1950 General Election), we could feasibly, within a generation or two, reach a point where there are more UK citizens on social than there are voting in elections.
Let’s assume, for the sake of argument, that in the intervening time the networks become more sophisticated (which they will), our analysis of this data evolves (which it will), and we develop mechanisms for more targeted opinion extraction from social (which we might).
Imagine a measure comparable with voting or poll participation, but with the scale of social.
It would be a measure of democratic representation more reliable, more open, and many orders of magnitude more dynamic than the election process. Not to mention millions of pounds cheaper.
If we had this measure, and we then had a situation like we did in 2015, where the analysis disagreed with the election result, what conclusion might we draw then? If the sample size of social were bigger than the sample size of the election, which result would then be the one questioned?
This week’s attempts by the polling industry to justify itself, and calm our suspicions of unrepresentative government, are a fun sideshow.
But it’s also a portent.
There’ll come a time when we have easier and cheaper ways to gauge public opinion than polls. And there’ll come a time when we have easier ways to gauge public opinion than elections.