Lies, damn lies and election polls: Why GE2015 pundits fluffed the numbers so badly

The lessons of shaping a mathematical 'reality'

Lies, damn lies and statistics

There is, of course, a “statistical” explanation for this. As every polling organisation will patiently explain many times over, forecasts don’t tell you precisely what the result will be. They are subject to a “margin of error”. This, in turn, varies according to how many people you have polled, and how close to the actual outcome you want the estimate to be.

That is why the small print of many polls includes an explanatory note to the effect that the forecast is “plus or minus x per cent with 95 per cent confidence”. That last bit is critical. It indicates that the pollster is 95 per cent confident, assuming that nothing has gone wrong with the polling process, that the right result will be within the margins they just told you about. Clear? In non-statistical speak, that means they just told you that one time in 20, they will get the result wrong.

As for the size of the margin, that, too, varies with the number of individuals polled. For a survey of around 1,000 voters, the 95 per cent margin of error works out at about three percentage points, which rather neatly matches the error we got: just add three points to the Tory forecast and subtract three from the Labour one.
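That three-point figure is not magic: it falls straight out of the textbook formula for the margin of error of a sampled proportion. A minimal sketch, using the worst-case proportion of 50 per cent:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p estimated from n respondents.

    z = 1.96 is the standard normal quantile for 95% confidence;
    p = 0.5 is the worst case (largest margin).
    """
    return z * math.sqrt(p * (1 - p) / n)

# For a typical 1,000-voter poll:
print(round(margin_of_error(1000) * 100, 1))  # ~3.1 percentage points
```

Note that halving the margin requires quadrupling the sample, which is why pollsters rarely go far beyond 1,000 respondents.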

Statistics to seats

The second failure relates to the number of seats relative to votes. In a non-proportional system, as the leaders of the smaller parties have repeatedly pointed out, the proportion of seats only imperfectly represents the proportion of votes. Parties whose votes are spread most thinly across the country fare worst, which is why the SNP, with support concentrated in one corner of the UK, managed to return 56 MPs with 1,454,436 votes (25,972 votes per MP), while UKIP, which drew its support far more widely, achieved just one MP with 3.8 million votes (see below).

Party          Votes        Votes per MP
Conservative   11,334,920   34,244
Labour         9,347,326    40,290
UKIP           3,881,129    3,881,129
Lib Dem        2,415,888    301,986
SNP            1,454,436    25,972
Green          1,154,562    1,154,562
Other          1,103,419    52,554
Total          30,691,680   47,218
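The per-MP figures above follow directly from dividing each party's vote total by its seat count. A quick sketch, using the actual GE2015 totals from the table:

```python
# Votes and seats won, per the table above (GE2015 actual results).
results = {
    "Conservative": (11_334_920, 331),
    "Labour": (9_347_326, 232),
    "UKIP": (3_881_129, 1),
    "Lib Dem": (2_415_888, 8),
    "SNP": (1_454_436, 56),
    "Green": (1_154_562, 1),
}

for party, (votes, seats) in results.items():
    print(f"{party}: {votes // seats:,} votes per MP")
```

The disparity is stark: roughly 26,000 votes bought the SNP each of its MPs, while UKIP and the Greens paid over a million per seat.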

Here, the central issue is the number of seats won by the Lib Dems because, in terms of percentage vote, the polls were not far out: where they erred significantly was in the way they converted votes into seats.

Party          Predicted   Exit poll   Actual
Conservative   273         316         331
Labour         273         239         232
UKIP           3           2           1
Lib Dem        27          10          8
SNP            52          58          56
Green          1           2           1
Other          21          23          21
Total          650         650         650

Table 3: ICM Poll of Polls seats forecast (6 May 2015) vs. exit poll and actual seats gained (7 May 2015)


How credible are these explanations?

Across several dozen surveys over a period of many weeks, every major polling organisation came to the same conclusion: a dead heat. Many individual polls sampled far more than 1,000 voters. Between 4 and 6 May alone, the Survation/Daily Mirror survey polled 4,000 electors; YouGov/The Sun surveyed more than 10,000.
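Larger samples matter here, because the margin of error shrinks with the square root of the sample size. Running the same textbook formula as before over the sample sizes just mentioned shows how little room sampling error leaves:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion, worst case p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

# Sample sizes cited above: a typical poll, Survation, YouGov.
for n in (1_000, 4_000, 10_000):
    print(f"n = {n:>6}: margin of about {margin_of_error(n) * 100:.1f} points")
```

At 10,000 respondents the margin is around one point, far too small to account for a six-to-seven point swing between forecast and result.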

With such consistency, the margin of error argument cannot stand.