Our 2017 General Election Prediction
UK General Election 2017 Poll Tracker
-
Election Data Journalism Roundup (16th May 2017)
Is there evidence for pollster herding in current polling? The Tories are due for a large victory in Scotland if their local election success translates into general election votes. A new category of voter, the "Re-Leaver", is defined. Why polls are imperfect predictors of election results.
Read more... -
The Tories Are Hiding: Why Were The Polls So Wrong In 2015?
The night of the general election in 2015 saw widespread shock on the faces of political campaigners, journalists and the public as the predicted hung parliament morphed before our eyes into a Conservative majority.
Read more...
For months the polls had consistently shown Labour neck-and-neck with the Conservatives, leading to a slew of opinion pieces on how to navigate a hung parliament, what a minority government would mean for the UK, and whether Ed Miliband was quaking in his boots at the thought of having to work with the SNP.
But the polls were incorrect – the Conservatives stormed ahead with 331 seats to Labour’s 232 and formed a majority government. Both figures were more than 40 seats away from the parties’ predicted totals – a huge margin of error. Although the vote share predictions for the other parties were reasonably accurate, we cannot ignore the fact that polling failed to represent the true state of our two major political parties. -
Why Opinion Polling Is More Art Than Science
"The problem is that when the polls are wrong, they tend to be wrong in the same direction." – Nate Silver
Read more...
In this post, we will dig a little deeper into the strange world of opinion polls. One of the challenges mentioned in our first article about polling was the difficulty of randomly selecting a group of people whose views reflect those of the overall population. To use polling terminology, finding a representative sample of people to answer your questions can make the difference between a poll that is spot on and one that is wildly inaccurate.
This post looks at two techniques that pollsters deploy to find a representative sample: random dialing and panels. It also looks at how something called weighting can be applied to poll results to make them more representative.
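As a rough illustration of the weighting idea (not any pollster’s actual scheme, and with made-up figures throughout), here is a minimal Python sketch: responses from over-represented groups count for less and those from under-represented groups for more, so that the weighted sample matches an assumed population age profile.

```python
# Illustrative sketch of demographic weighting. All figures are invented
# for the example; real pollsters weight on several variables at once.

# Hypothetical raw sample: each respondent has an age group and a voting intention.
sample = [
    {"age": "18-34", "vote": "Labour"},
    {"age": "18-34", "vote": "Labour"},
    {"age": "35-54", "vote": "Conservative"},
    {"age": "35-54", "vote": "Labour"},
    {"age": "55+", "vote": "Conservative"},
    {"age": "55+", "vote": "Conservative"},
    {"age": "55+", "vote": "Conservative"},
    {"age": "55+", "vote": "Labour"},
]

# Assumed population age profile (census-style shares, made up here).
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Share of each age group in the raw sample.
sample_share = {
    group: sum(r["age"] == group for r in sample) / len(sample)
    for group in population_share
}

# Weight = population share / sample share: over-sampled groups count less.
weights = {group: population_share[group] / sample_share[group]
           for group in population_share}

def weighted_share(party):
    """Vote share for a party after each response is scaled by its weight."""
    total = sum(weights[r["age"]] for r in sample)
    return sum(weights[r["age"]] for r in sample if r["vote"] == party) / total

for party in ("Labour", "Conservative"):
    raw = sum(r["vote"] == party for r in sample) / len(sample)
    print(f"{party}: raw {raw:.0%}, weighted {weighted_share(party):.0%}")
```

In this toy example the over-55s are over-represented in the raw sample, so weighting nudges Labour up (50% to 56%) and the Conservatives down (50% to 44%). Real polling weights are built from several variables at once (age, region, past vote and so on), but the principle is the same.
-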
Election Data Journalism Roundup (7th May 2017)
What do this week's local election results mean for general election forecasts? The Conservatives' conquest of local councils is turning some parts of Britain blue for the first time. Labour might be winning the battle on social media, but does it matter?
Read more... -
Election Data Journalism Roundup (3rd May 2017)
Labour losing support on three fronts. Is Wales the new Scotland? What are the chances of a Le Pen victory in France?
Read more... -
How Polling Works (And Why It’s Difficult To Do Right)
"Not everything that counts can be counted, and not everything that can be counted counts."
Read more...
Elections mean many things – campaign rallies, manifestos, a debate about whether there will be a debate, and, of course, polls.
Polls generate daily headlines in an election – even, or especially, when they contradict each other. Despite the polls’ failure to accurately predict the result of either the 2015 general election or the Brexit referendum, a new poll can still be front-page news. Public trust in opinion polls has plummeted, with one survey finding that 75% of adults don’t trust surveys.
So, how does polling work? Why is it so difficult to get right in the UK? Our goal is to explain enough of the essentials to get you through most conversations about election polling, and to enable you to cast a critical eye over the headlines that polls generate between now and the election. -
Error Margins Explained In Three (±1) Minutes
How is it possible that an opinion poll claims to know what the whole country thinks just by asking a few hundred people?
Read more...
Ask a hundred people if they like cats and around 63% will say yes, give or take a few. If you want to be more accurate, you can even say that the number will be between 53 and 73. What if you asked a thousand people? Perhaps you’ll find 530 to 730 ailurophiles this time?
"Wisdom is the daughter of experience", according to Leonardo da Vinci, and experiencing more responses to our cat question means we can actually make a wiser guess - with a sample size of one thousand people you should find that 600 to 660 of them like cats. -
Introducing SixFifty
What if the general election could be informed by real data and not just speculation?
Read more...
This week 97% of the MPs who voted backed holding a general election on June 8th. By calling a snap election, Theresa May is seeking to capitalise on disorganisation in the opposition and to strengthen the mandate for her Brexit strategy. As a result, the country is going to the polls for the third time in two years.
If recent revelations in the US and UK have shown us anything, it's that unbiased, informed debate can be hard to come by in the national media. Polls used to be at the heart of this discussion, but after failing to accurately predict either the 2015 general election or the 2016 Brexit vote, they are now seen as untrustworthy.
We don’t believe it has to be this way. We live in a world where "artificial intelligence" and machine learning seem to be everywhere – yet where are these supposedly amazing technologies when it comes to something as useful as political forecasting?
Our goal is rooted in a simple question – if we brought together a group of skilled data scientists, software engineers and experienced political operatives, whose only motivation was to create something truly amazing that provides valuable, unbiased information to voters, how far could we go?
About SixFifty
SixFifty is a non-partisan collaboration of data scientists, software engineers, journalists and political experts dedicated to bringing a rigorous, data-oriented, and impartial view to the 2017 UK general election.