The UK Election: 7. What Can Polling Predict?
Understand the UK Election is a simple 10-part guide to everything that is going on in the election, with Adam Fleming.
In this episode, what do polls actually tell us? What can we learn when the polls get it wrong? And do people really tell the truth about who they are intending to vote for?
This episode was hosted by Adam Fleming, from Newscast and AntiSocial, with Professor Jane Green, Director of the Nuffield Politics Research Centre at Oxford University, co-director of the British Election Study and President of the British Polling Council.
Producers: Alix Pickles and Alex Lewis
Production Manager: Janet Staples
Editor: Sam Bonham
Listen and follow along
Transcript
This BBC podcast is supported by ads outside the UK.
BBC Sounds, Music, Radio, Podcasts.
Hello and welcome to Understand the UK Election, your essential guide to the general election.
I'm Adam Fleming.
My day job is presenting the BBC's daily news podcast Newscast where there is loads of coverage of the UK general election campaign.
I've been covering these since 2005, meaning this is my sixth general election as a professional journalist.
And I have to say, some of my most vivid memories of politics are about general elections.
Although my vivid memory of opinion polling, and that's the subject of this episode, actually comes from the 2014 referendum on Scottish independence.
I was the political correspondent on duty the Sunday morning before the vote, and the big story that morning was an opinion poll that showed the yes to independence campaign in the lead.
That was like a bomb going off in the no campaign against independence and that then became the story that day.
So let's talk more about the art and the science of opinion polling.
And to do that, I'm joined by my guest, Jane Green, who is professor of the Nuffield Politics Research Centre at Oxford University.
She's also co-director of the British Election Study, and she's president of the British Polling Council.
Madam President, hello.
This is quite a recent title, but thank you.
What does the British Polling Council do?
The British Polling Council exists purely to increase transparency and accountability in political polling.
So anybody who becomes a member of the British Polling Council will provide the sample size, the fieldwork dates, who they've done the poll for, the distributions, the percentages on various different questions and demographics, and information about how they've weighted the data.
And I should say we're recording this episode of Understand the UK election quite early on in the campaign.
So things may have changed by the time you're actually listening to this.
So let's just talk about kind of in broad terms since 2019, the last time we had a general election, what has been the big kind of meta story that the polls have told us?
What you can really understand from opinion polls particularly well is the direction of travel.
The direction of travel since 2019 has been downwards for the Conservatives and really very substantially.
So since the Truss mini budget, which was the 49-day period when Liz Truss was Prime Minister and she announced this sort of almost a budget statement around taxation and so on, to which the markets reacted extremely negatively.
Since then, we've seen a collapse of trust for the Conservative Party in terms of their reputation for economic competence.
You know, not just the vote intention change, but also under the surface, who's trusted on what.
Not only is there kind of a downward trend, but those moments are also really significant, because you start to see a sort of drop-off-a-cliff moment, then the downward trend again, and then another drop-off-a-cliff moment.
And if anything, there's been a bit of a more negative movement as Reform UK, you know, the successor party to UKIP and the Brexit Party, started to gain some support.
But the really important thing about the opinion polls right now to understand is that there's a real divergence between pollsters of what the Labour lead is.
Is it really, really massive?
Is it over 25%?
Or is it down on the sort of 15% lead?
And so there's a really big difference.
And how are the Lib Dems and the Greens doing on the graphs?
So the Greens have been doing slightly better overall, and the Lib Dems a little bit too.
But what's important for those parties is actually not so much, I mean, of course, it matters nationally if their support starts to increase, but it matters where it increases.
They need to get all of that vote stacked up in a few places.
Yeah, that's a very good bit of advice for interpreting the smaller parties' poll numbers.
It matters where that is.
Although I suppose that matters for all the political parties though, doesn't it?
Because of the electoral geography of the country and our voting system.
Like you need you need to have the votes in the places that are going to help you win a majority, not necessarily just win the most votes.
Yeah.
So the two largest parties need to do both, because obviously if most of the constituencies in the country are Conservative-Labour races, then your sort of national polling support is crucial to that.
The distinction for the minor parties is that their vote currently does them some damage if it's evenly distributed.
And I'm now just thinking back to all the lessons you get taught as a baby political reporter about polls.
Well, the main one is it's a snapshot in time of that moment, which does not mean it's a prediction of what the outcome of an election is going to be.
Exactly.
So they're telling you what the sentiment is right now.
The other thing remember is that polls have a margin of error, and that's based on sampling statistics, based on how big the sample is.
The key distinction there is don't say, okay, well, there's been a two-point increase in the Conservative Party vote intention in that poll and a two-point drop in the Labour Party.
Oh, there's a four-point gap.
It's really significant.
No, those two answers are in the margin of error.
Nothing's changed.
And that gets over-interpreted all the time.
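The margin-of-error point above can be made concrete. Here is a minimal sketch of the standard 95% margin of error for a single proportion, using an assumed sample size of 1,000 (the figures are illustrative, not from any poll discussed in the episode):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p from a simple random sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

n = 1000
moe = margin_of_error(0.40, n)  # a party polling at 40%
print(f"Margin of error: +/-{moe * 100:.1f} points")  # roughly +/-3 points

# So a 2-point rise for one party and a 2-point fall for another, between
# two polls of this size, can sit entirely inside the noise band.
```

Real polls use quota samples and weighting rather than simple random samples, so their effective margin of error is usually somewhat larger than this textbook figure.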
Oh, yeah.
Well, and the second lesson you learn is that you have to look at the trends rather than an individual result on a day.
And also across the pollsters.
I mean, this is one of the myths about polls in 2016 was that the polls got the Brexit referendum wrong.
And in actual fact, there were as many polls putting leave ahead as there were remain.
And what you saw during that campaign in the referendum was increased support for leave and pretty much consensus across the pollsters in that regard.
The trend was moving in the same direction.
So it was there for people to see.
So let's talk about how polls are done.
So your classic voting intention poll.
Okay.
What's the main way in which they're conducted by the polling companies?
Okay.
So if you think about what you're trying to do here, you're trying to create a sample, a sample of the electorate that looks as much like the wider population as it can possibly be,
which means that you know, because you've got census data and other very, very high-quality surveys, that there's roughly, you know, this distribution of older people, there's roughly this distribution of different backgrounds and demographics.
And so you want to make sure your sample looks as much like that distribution.
And they're always trying to make sure: have we got enough people that aren't interested in politics?
This was the big lesson from 2015.
Have we got enough people from this kind of part of the country, from this kind of demographic?
You know, in order to make sure that they've got enough of a pool of people from which they can draw a representative sample.
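The weighting step Jane describes can be sketched very simply: compare each demographic group's share of the sample with its share of the population, and weight respondents up or down accordingly. All numbers here are invented for illustration, not real census figures:

```python
# Toy demographic weighting: illustrative shares only, not real census data.
population = {"18-34": 0.27, "35-54": 0.33, "55+": 0.40}   # assumed population shares
sample     = {"18-34": 0.35, "35-54": 0.35, "55+": 0.30}   # shares who answered the survey

# Weight = population share / sample share, so under-represented groups count more.
weights = {group: population[group] / sample[group] for group in population}

for group, w in weights.items():
    print(group, round(w, 2))
```

In practice pollsters weight on many variables at once (age, region, past vote, political attention, and so on), often by iterative raking rather than a single ratio, but the principle is the same.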
You talked about 2015.
Why is that such an important year for pollsters?
So every single election I've worked on, there has been a big shocker of some kind.
So in 2015, the polls had had Labour ahead.
The big shocker was that Labour didn't win.
This is an exit poll, very carefully calculated, not necessarily on the nail.
But here it is, 10 o'clock.
And we are saying the Conservatives are the largest party.
Sensational, David.
An extraordinary night if...
That exit poll is right.
If this exit poll is right, I will publicly eat my hat on your programme.
Your puzzle?
The British Polling Council then established a group of people, I happened to be one of them at the time, who looked into what were the reasons then.
In that particular instance, we really deduced that this was a problem of more politically attentive people answering surveys, and that therefore the pollsters made various different kinds of corrections to make sure they could learn lessons from that particular difficult election for them.
Right, British politics loves a three-letter acronym.
Oh, I'm going to fire a three-letter acronym at you in the hope that you will explain it.
MRP.
So, multivariate stratification, no, with post-stratification.
Something with post-stratification.
Shall we look it up?
So we can't do that.
Completely correct.
Do you know what?
I'm so pleased it doesn't even roll off the tongue for you.
Multi-level regression and post-stratification.
And post-stratification.
Right.
So the post-stratification tells you something, and the multi-level bit tells you something.
So if you have a really big survey, you've not just interviewed a thousand people, you've interviewed 10,000 people or more.
Now, you've got people all over the country then, and you know things about the individuals, so you know their demographics.
You also know where they live.
And so these models, these MRP models, are taking that individual-level data and the data about where that person lives to build the likelihood then of an area doing something, because it's got lots of those kinds of people living in that area.
So now supplant constituency for area, now we're going to get an estimate potentially in every single constituency because constituencies share characteristics, there's types of constituency.
So one example of that would be: okay, well, we know that there's types of constituency that are more leave-leaning, more remain-leaning.
So you're getting an estimate for every constituency in the UK of how that constituency is going to vote in a general election based on today's survey data.
So again, not a prediction, but a snapshot, but a snapshot at a geographic constituency level, using that individual-level data and the information about where different kinds of people live.
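The post-stratification half of MRP can be sketched with toy numbers. Real MRP first fits a multilevel regression to estimate vote probability for each demographic "cell"; here those cell estimates are simply assumed, and every figure below is invented for illustration:

```python
# Toy post-stratification step of MRP. All numbers are made up.
# P(vote for Party X) estimated for each demographic cell (age x education).
cell_support = {
    ("18-34", "graduate"): 0.55,
    ("18-34", "non-grad"): 0.45,
    ("55+",   "graduate"): 0.35,
    ("55+",   "non-grad"): 0.25,
}

# Share of each cell in two hypothetical constituencies, from census-style data.
constituencies = {
    "Seat A": {("18-34", "graduate"): 0.4, ("18-34", "non-grad"): 0.2,
               ("55+", "graduate"): 0.2, ("55+", "non-grad"): 0.2},
    "Seat B": {("18-34", "graduate"): 0.1, ("18-34", "non-grad"): 0.2,
               ("55+", "graduate"): 0.2, ("55+", "non-grad"): 0.5},
}

def poststratify(cells, support):
    """Weight each cell's estimated support by its share of the seat's electorate."""
    return sum(share * support[cell] for cell, share in cells.items())

for seat, cells in constituencies.items():
    print(seat, f"{poststratify(cells, cell_support):.1%}")
```

The same cell-level estimates produce different seat-level estimates purely because the two seats have different demographic mixes, which is exactly the "constituencies share characteristics" idea in the conversation.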
How does the exit poll work on election night itself?
So the exit poll uses the change in the vote in types of different places from the last election.
Even though it's called an exit poll and you think, oh, this is just a survey being taken, it is a survey where a number of different constituencies or polling areas are selected, so wards are selected that are enough to say something about the kind of place each one is.
So let's say, again, it's a Conservative-Liberal Democrat race.
And so you want to know how those places voted last time, how people say they voted this time, and you want to look at the difference from one election to the next.
And then use that to say, okay, we've got this sample of places.
Now we're going to apply that to the whole of the UK and we're going to get our estimate of what the result's going to be.
And so individuals are literally outside polling stations, carefully selected polling stations with as much continuity to the last election as is absolutely possible to ensure that you can look at change over those two elections.
And it's been pretty accurate.
So it's not just the polling has done a really good job.
It's also the modelling.
It's also that methodology of looking at change in the same place over time has done a really good job of applying to the UK as a whole.
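The change-over-time logic of the exit poll can be sketched in a few lines: measure the swing in the same sampled places across two elections, then apply the average swing to the previous national result. Every number here is invented for illustration (the real exit poll models seat-level probabilities across many stations, not a single national swing):

```python
# Toy version of the change-based exit poll logic. All numbers are made up.
# Each tuple: (party's share in that place last election,
#              party's share in today's exit interviews at the same place).
sampled_places = [
    (0.45, 0.38),
    (0.50, 0.44),
    (0.40, 0.31),
]

# Swing in each place, then the average across the sample of places.
swings = [now - before for before, now in sampled_places]
avg_swing = sum(swings) / len(swings)

national_share_last_time = 0.44          # assumed previous national result
projected = national_share_last_time + avg_swing
print(f"Average swing: {avg_swing:+.1%} -> projected share: {projected:.1%}")
```

Because the same polling stations are revisited election after election, place-specific quirks largely cancel out, which is why the change-based method has been so accurate.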
How does polling deal with the fact that somebody might not tell you the truth or what they really think?
Right.
So when I think about polling, I think about opinion polling companies asking, you know, a thousand people a set of questions.
And then you're taking averages.
So you're making the assumption, I think on the whole, very fairly, that that's going to kind of cancel out, right?
That some people are going to give you nonsense on one side and some people are going to make up something on the other side.
There's always around about 5 or 6% of a survey that you look at that and you think, who are they?
You know, I don't think that's because people are sitting there thinking, all right, well, trick them.
I think it's because people make errors and people rush through things or people give a gut reaction.
And perhaps there's a little bit of being cheeky.
Now, what we do as academics is we use surveys to run statistical models, essentially, on all sorts of outcomes.
And using statistical methodology, you are saying then, well, what's the effect of this that's sort of greater than chance?
And we know there's some noise, we know there's some error, we know that there's some kind of fuzziness around the edges, but we're using statistical methods to say, well, this actually is a greater-than-chance likelihood that that actually does matter.
And do you have any insight into how the political parties use their own polls?
When Rishi Sunak called the election, you know, it was obviously a surprise, but it didn't lack kind of sense, right?
I mean, I thought the Conservatives should probably have gone to the polls in May, because I thought that was the kind of damage limitation timing, electorally speaking.
And then you started to see, oh, there's a bit of economic good news.
Actually, Reform UK didn't do so well.
Maybe on the basis of the local elections, the Labour Party isn't as far ahead as they might have thought before the poll.
So then they were using the accumulation of election information, and also expectations about how the economy might move and shift people's economic optimism in opinion polls, to try to understand essentially what was the most advantageous time.
So that's a classic example, but clearly they're using a whole set of evidence, including polls.
The spread in the opinion polls at the moment is a very significant factor for why you might then say, Okay, I'm the Conservative Party, I'm gonna be less bullish about this, that, and the other.
Or maybe you're the Labour Party saying, well, we don't want to take this for granted, because we know that there's this divergence, and possibly our lead's at the bottom of that range, possibly it's at the top, because we won't know that for a few weeks.
Never a dull moment, it's exciting, Jane.
Thank you very much.
And that's all for this episode.
Next time, we'll be looking at the role of the media.
And you can find more episodes on BBC Sounds.
Just search for Understand the UK Election.
And if you want to keep up to date with the day-to-day blow-by-blow news from the campaign, then you can listen to my other podcast, Newscast.
See you again soon.
Bye.