Are Phone Polls More Accurate than Internet Polls in the EU Referendum?

Posted on 30 March 2016 by John Curtice

One of the striking features of polling of referendum vote intentions to date has been that polls conducted by phone have been producing more favourable results for Remain than those undertaken via the internet. Typically, once Don’t Knows are left to one side, the former have been suggesting that support for Remain is around the 60% mark (and thus that for Leave at 40%), whereas the latter have been putting the two sides more or less neck and neck.

Inevitably, this has raised the question of which set of polling numbers (if either) should be believed. Providing an answer has not been easy, because close examination of the detailed tables for the two sets of polls simply uncovers the fact that phone polls find more Remain voters than internet polls do within more or less every category of voter. Thus, for example, both sets of polls invariably find that younger voters are keener than older voters on remaining in the EU. It is just that phone polls have typically found support for Remain running around ten points higher amongst younger voters than internet polls have, and around ten points higher amongst older voters too.

However, a new paper published today jointly by Populus and Number Cruncher Politics reports the results of some methodological experimentation that attempts to explain the discrepancy between the two sets of polls – and gives us some handle on which might be closer to the truth.

Hypotheses and Evidence

The paper sensibly starts from the premise that, given that differences between the two sets of polls are evident within each of the various demographic characteristics by which pollsters commonly tabulate (and weight) their samples, the answer to the divergence must lie elsewhere. They look at two possibilities. The first is that it arises as a result of differences between the two kinds of polls in the number of ‘Don’t Knows’ that they obtain. The second is that it is occasioned by important differences in the composition of their samples that are not captured by the pollsters’ standard demographic variables.

On the first point, the paper reports the outcome of two experiments, one conducted on a phone poll, the other on an internet one, in which roughly half the respondents were offered ‘Don’t Know’ as a possible answer (as is typically the case in internet polls), and half were not (or in the case of the internet sample, it was only available in small type at the bottom of the relevant page). It finds that in both cases more people said ‘Don’t Know’ when it was offered as an option – and that in both instances support for Remain was markedly higher when it was not offered. The implication is that those who say ‘Don’t Know’ tend to be closet supporters of the status quo, and thus polls (such as most internet polls) that offer that option are consequently at risk of underestimating support for Remain.
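
To see the mechanics of that implication, a little arithmetic helps. The sketch below uses invented numbers (not the paper's actual results) to show how, if Don't Knows are drawn disproportionately from would-be Remain answerers, simply offering the option depresses the measured Remain share:

```python
# Illustrative arithmetic for the 'closet Remain' effect of offering a
# Don't Know option. All numbers are invented, not the paper's results.

# Condition A: no Don't Know option offered.
remain_a, leave_a = 0.55, 0.45

# Condition B: the option is offered; suppose 15% take it, and they are
# drawn disproportionately (70/30) from would-be Remain answerers.
dk_share, dk_remain = 0.15, 0.70

remain_b = remain_a - dk_share * dk_remain        # Remain answers lost to DK
leave_b = leave_a - dk_share * (1 - dk_remain)    # Leave answers lost to DK

# Remain share among those still expressing a view falls below 55%:
remain_among_decided = remain_b / (remain_b + leave_b)
print(round(remain_among_decided, 3))  # 0.524
```

On these assumptions, offering the option knocks roughly two and a half points off the headline Remain figure, which is the direction of the effect the experiments found.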

On the second point, the paper notes that in the academic British Election Study (BES), those who favour remaining in the EU are also more likely to give a socially liberal answer to questions on whether attempts to introduce greater equality for women and for racial minorities have gone too far or not far enough, and are also more likely to say they are British rather than English. Populus thus included these questions on both a phone and an internet poll of referendum vote intentions, and weighted the resulting samples so that they matched the distribution of responses to these three questions obtained by the face-to-face random probability survey conducted as part of the BES (and which was relatively successful in replicating the 2015 UK general election result). This weighting reduced what was otherwise an eleven point difference between the two surveys in Remain’s estimated lead over Leave to one of just three points.
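
The mechanics of that attitudinal weighting can be sketched as follows. This is a hypothetical illustration of simple cell weighting (each cell's weight is its benchmark share divided by its sample share), not the actual Populus procedure, and the numbers are invented rather than the BES figures:

```python
# Hypothetical sketch of cell-based attitudinal weighting. The categories,
# shares and Remain figures are invented illustrations, not real data.

def cell_weights(sample_shares, target_shares):
    """Weight each cell so the weighted sample matches the benchmark:
    weight = benchmark share / sample share."""
    return {cell: target_shares[cell] / sample_shares[cell]
            for cell in sample_shares}

def weighted_remain(remain_by_cell, sample_shares, target_shares):
    """Remain share after reweighting the cells to the benchmark."""
    w = cell_weights(sample_shares, target_shares)
    num = sum(remain_by_cell[c] * sample_shares[c] * w[c] for c in w)
    den = sum(sample_shares[c] * w[c] for c in w)
    return num / den

# Hypothetical: a sample that is more socially liberal than the benchmark.
sample = {"liberal": 0.55, "conservative": 0.45}
target = {"liberal": 0.45, "conservative": 0.55}   # e.g. a BES-style benchmark
remain = {"liberal": 0.70, "conservative": 0.40}   # Remain share per cell

print(round(weighted_remain(remain, sample, target), 3))  # 0.535
```

Because the liberal cell is over-represented in this invented sample, down-weighting it pulls the Remain estimate below the unweighted figure, which is the direction of adjustment the paper applies to the phone sample.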

Evaluation

This paper undoubtedly makes a most welcome contribution to what hitherto has been an unresolved puzzle. But there are some questions to be asked, both about how far it really does help explain the difference between the estimates obtained by internet and phone polls, and about whether its evidence justifies the assertion that the ‘true state of public opinion’ is ‘closer’ to the picture obtained by phone than by internet polls.

Let us look first at the evidence on the prevalence and impact of Don’t Knows. It is certainly the case that most internet polls report more people saying ‘Don’t Know’ than do phone polls. But this is not invariably the case.

First, in two polls that it conducted in February and March by phone, Survation reported 19% saying Don’t Know, a figure that was little different from the 18% and 21% that the company previously reported in two internet polls it conducted in December and January. Yet at 58%, the average level of support for Remain in the two phone polls (after the Don’t Knows were left aside) was still markedly higher than the average of 48% obtained by that company in its two previous internet polls.

Second, there is one company, ORB, whose internet polls have not contained any Don’t Knows – because that option was not available at all. Yet in the six polls it administered in that way, support for Remain stood on average at just 51% – little different from the figure being obtained by other internet polls with many more Don’t Knows.

Meanwhile, if it were the case that those who say Don’t Know to pollsters are disproportionately closet supporters of Remain, we should find that when pollsters attempt to ‘squeeze’ their respondents by asking them which option they were more likely to back, Remain should emerge with a clear lead. Yet in the handful of polls where an attempt has been made to squeeze the Don’t Knows, this is not the picture that emerges. In its most recent phone poll, released yesterday, Ipsos MORI found only a modest Remain lead of 32% to 25% amongst those who initially said Don’t Know, while in its recent phone poll ORB found this group leaning slightly towards Leave, by 37% to 31%. Meanwhile, in two internet polls conducted recently by BMG, the Don’t Knows broke almost evenly when squeezed, with 23% on average inclining towards Remain and 21% towards Leave.

In any event, even if we accept that those who say Don’t Know are more likely to support Remain than Leave, it is still far from clear that polls with a lower proportion of Don’t Knows should be assumed to be more accurate. As the most recent Ipsos MORI and ICM polls both demonstrate (the two companies have now started collecting information on reported likelihood of voting), those who say Don’t Know are less likely to say they will make it to the polling station. A poll that minimises the number of Don’t Knows but makes no attempt to estimate or model likely turnout could therefore be giving too much weight to a group of voters who are relatively unlikely to cast a vote.
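
A stylised example makes the point. The numbers below are invented, not any pollster's actual data; they simply show how squeezing the Don't Knows into the headline figure interacts with their lower likelihood of voting:

```python
# Invented numbers illustrating the interaction between squeezing
# Don't Knows and turnout weighting; not any pollster's actual data.

groups = {
    # name: (share of sample, turnout probability, Remain share once squeezed)
    "Remain":     (0.40, 0.85, 1.00),
    "Leave":      (0.40, 0.85, 0.00),
    "Don't Know": (0.20, 0.50, 0.60),  # lean Remain, but low turnout
}

def remain_share(groups, use_turnout):
    """Headline Remain share, optionally down-weighting by turnout."""
    num = den = 0.0
    for share, turnout, remain in groups.values():
        w = share * (turnout if use_turnout else 1.0)
        num += w * remain
        den += w
    return num / den

print(round(remain_share(groups, use_turnout=False), 3))  # 0.52
print(round(remain_share(groups, use_turnout=True), 3))   # 0.513
```

In this sketch, squeezing the Don't Knows without modelling turnout flatters Remain slightly; once the group's lower turnout is allowed for, its contribution to the headline figure shrinks.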

But what of the paper’s evidence on sample composition? Here we discover that, as compared with the BES face-to-face random probability sample, an online poll conducted by Populus as part of the methodological experiment proved to be markedly less socially liberal, while an identical one conducted by phone proved to be more liberal. In short, neither exercise proved to be particularly accurate as compared with the BES, though in fact it was the phone sample that was slightly further astray. Meanwhile, we should bear in mind that the advocates of internet polling at least would argue that the reason why their samples appear less socially liberal than those conducted by phone or face-to-face is that, without an interviewer present, respondents feel less inhibited about giving what they think is a socially unacceptable response. In other words, the difference between the internet poll and the BES could be a consequence of the mode of interviewing rather than of a difference in the composition of their samples.

Conclusion

In short, while the paper takes a valuable step forward in identifying the importance of obtaining, in EU referendum polls, samples that are representative of the balance of social liberalism and social conservatism in our society, it is less clear that it has established that phone polls are proving more accurate than internet polls in that regard. If anything, the paper would simply seem to point to the conclusion that both may be failing to do so, albeit in different ways.

Meanwhile, we should note, somewhat ironically, that the gap between internet and phone polls has, in fact, now suddenly narrowed. In the phone polls they have conducted this month, ComRes, Ipsos MORI and Survation have on average put Remain on just 55% (once Don’t Knows are left to one side), down five points on the equivalent figure for February. Meanwhile in the first phone poll that they have conducted, ORB actually put Leave narrowly ahead this month by 51% to 49%. In contrast there is no sign of any equivalent movement in those polls conducted via the internet. In their three most recent polls, ICM, TNS and YouGov put Remain on average on 51% and Leave on 49%, little different from the picture of an even contest that has been painted throughout by internet polls. Perhaps this is a sign that as the referendum comes closer into view and voters begin to develop rather firmer views, how the polls are conducted will come to matter less, much as proved to be the case in the Scottish independence referendum.

Alas, however, that will be no guarantee that they will be right.


By John Curtice

John Curtice is Senior Research Fellow at NatCen and at 'UK in a Changing Europe', Professor of Politics at Strathclyde University, and Chief Commentator on the What UK Thinks: EU website.

14 thoughts on “Are Phone Polls More Accurate than Internet Polls in the EU Referendum?”

  1. Back to the original question of why there is a difference between telephone and online polls: I am no expert on all this, but could it simply be down to the fact that online polls require the recipient to respond, whereas a telephone poll, presumably outbound from the polling company, requires the recipient only to say yes or no (or don’t know)? If, as seems possible, the Brexiteers (in the round) are a more motivated group, they are more likely to take the trouble to respond to an online poll. As the referendum itself requires the voter to take action (indeed rather more, as they have to go to a polling station or complete a postal vote), might this mean that the online polls are a better judge of the possible result?

  2. It seems close now, although I reckon the Yes side will win by between 3 and 19%.
    The bookies are a good place to go if you want to make a bit of money. May I point out that if Britain votes to leave, we will have to pay for new passports, because they carry the words ‘EU citizen’ on the front cover. I am going to vote to stay. Ten years ago I might have been in the Brexit camp; however, I know better now. Why can I sit in a Belgian bar in Brussels and smoke inside, when I can’t smoke inside any British bar? British politicians made our smoking law. Most people are working so hard that they don’t know what is happening with European directives versus English statutory law. Mobile phone costs are down in Europe, and in the Eurovision Song Contest this year only 2 of the songs were not in English, and it was presented in English; English is taught to European schoolchildren. We need to negotiate with the EU and not just walk away; we have good chess pieces.

  3. Another interesting point. First I must say that I’m voting to leave, along with my wife and 34 of our friends.
    We will all vote Leave at any cost, regardless of weather conditions on the day. We have even offered each other free transport to the polling station. I have noticed that some of our friends who want to stay in the EU
    said that if it is a wet day they won’t be voting. Thinking about those who do vote, I notice that the young are less likely to vote at any time, while we older people do tend to vote.

    1. Thanks for the interesting point. Can we have some stats related to your personal poll? To start with, it would be good to know the geographical location of your sample, how many males and females your sample has, its age distribution, its working-status distribution (working, not working, pensioner) and to which population you are projecting your results. Reading your response, I see that you classified your sample as being “older people”, and you infer that these “older” people are the ones who are more likely to vote and will probably vote NO, compared to “younger people” who will probably, according to your poll, vote YES (this YES depends a lot on weather conditions). One of the conclusions from your post is that if the weather is DRY the YES vote has some (but slim) chance of winning, and if it RAINS then the NO vote will have more chance of winning.
      Words of encouragement: your poll used the same methodology as YouGov or some other research company mentioned in the blog post, so your results are as valid as theirs.

      1. Hello, I’m not taking a poll, just talking to friends and work colleagues over the last few weeks.
        The area is Hereford and our friends are all over 40, with more or less an even split between men and women.
        We are all married. Some have told us about their teenage children and whether they will vote or not.
        Some of the people we know do not vote on a regular basis and have been known not to vote if it’s wet.
        We that want out are very keen to vote, as we see the EU as a failed project.
        So my poll, if you want to call it that, is a real-life example of people we know who will and will not vote.
        I hope over the coming weeks to get more people to vote Leave; free transport to the polling station is on offer.

  4. The problem is that Outers are all over blogs and other comment boards like a rash in a way that Remainers aren’t. If you’d read all the posted comments in Thanet prior to the general election, it would have been impossible to come to the view that anyone other than Nigel Farage would win. We all know the actual outcome. It’s generally true that the far right now seem to monopolise blogs, but in every actual vote they don’t have much of an impact. Also, Dover is hardly representative of the UK.

    1. This is almost a mirror image of the Scottish referendum. YES were probably 80-20 ahead on social media and the like, yet the polls were predicting a much closer race than it turned out to be. The issue is that OUT are talking to each other and convincing themselves that there is momentum and that they will win; it becomes a self-fulfilling prophecy for them because they think everyone they speak to is the same as them. I think the same may be true here, and the reality of the vote will be a reasonably comfortable win for Remain. The weather, voter turnout, even how well England are doing in Euro 2016, will all have a part to play, but I think OUT have a mountain to climb. If I were them I would be very wary of taking a lead in the polls, as that will drive out the soft Remain vote, who would ideally like to be apathetic about the whole thing, and then it won’t even be close.

  5. I am amazed how the majority of commentators on the polls and their methodology ignore basic logical questions that need to be asked when analysing the validity of any research based on samples. Leslie Kish, an authority on survey sampling, said: “Inferences from an ill-defined or undefined ‘sampled’ population to a target population and to higher inferential populations would require flights of imagination.” The BPS inquiry into the GE2015 results concluded that samples were unrepresentative, but the report does not mention what sampling frame was used to select the online and telephone samples. For example, YouGov uses its own sampling frame of no more than 500,000 volunteers. Considering the amount of “research” they do, and that all of it is based on their panel of 500,000, it is not difficult to imagine how samples are drawn for their surveys. A YouGov sample (and the samples of MR companies that use online panels) will be representative only of those in their database of 500,000, and not of the general population of the UK. So inferences can be made only about a population of 500,000. I do not know how this is difficult to understand and why everybody ignores this fact. There is no way that YouGov or any other similar company can defend making inferences about the UK based on a sampling frame of 500,000 volunteers. As Kish said, you “… would require flights of imagination” to make these inferences. And we must admit that the majority of polling organisations do stretch their imagination when interpreting the results of their polls. If YouGov and similar research companies sell research, they need to be transparent about their methodology, and this transparency requires that they publish the profile of their sampling frame (at least to their clients) and tell potential clients to what population inferences can be made from the sample. The general public is led to believe that if YouGov and similar organisations say that a “sample is representative”, we have to trust them, because they say so.
    I hope that Professor Curtice will address this question in one of his blog posts.

  6. Not really a comment on this column, but I have been thinking recently that with the vote balanced as finely as your poll of polls seems to suggest, it would only take some striking event in the period between now and the vote to swing things one way or another.

    You could say that about any close vote, but the referendum seems a bit special in that it is really a vote about credibility. Relatively few voters can do their own research into the supposed benefits or losses of Brexit, so they have to decide which side’s claims to believe. Either side’s claims have to be taken on faith, yet each voter will have their own underlying bias for or against Brexit.

    So some startling event could supply a voter with a pretext to believe one side of the debate over the other, when examining the competing claims might be inconclusive. And my sense is that any unexpected event is more likely to favour Brexit than to benefit Remain.

    1. Yes, I agree with your view that a significant event could well swing it, and more likely in the Leave direction.

      I am surprised and disappointed that UK sovereignty – specifically English sovereignty – is not cited as one of the major remain/leave factors. There seems to be a lack of understanding that 1,000 years of history culminating in the Common Law is being overridden by the continental tendency towards Roman or Napoleonic law. This is the very reason that the EU seeks to regulate every aspect of our lives: because you can do nothing unless you are ‘allowed’ to do so.

      Is it too much to hope that a realisation of this could swing it to Leave?

  7. I can’t imagine why anyone would bother to go to an online poll just to say they ‘Don’t Know’.

    From my own research into public comments on internet news items regarding the EU, the Outers are massively ahead.

    The ones that are really passionate and comment on news items and will vote come hell or high water are the Outers.

    In a phone poll there is no knowing who will pick up the phone and answer: you are just as likely to get someone who will vote as someone who won’t bother.
    At the end of the day, the only poll that matters is the one on 23rd June, and my firm belief, for what it’s worth, is that the Outers will have it.

    1. You make an interesting point, Phillip, but voting doesn’t measure strength of feeling, so the passionate Leave voters may make more noise; that doesn’t mean there are more of them.
