The Divergence Between Phone and Internet Polls: Which Should We Believe?

Posted on 25 May 2016 by John Curtice

The divergence between the estimates of the relative strength of Remain and Leave in internet polls and those obtained by phone polls has been the most marked feature of polling in the EU referendum to date. Typically, internet polls have suggested that the race between the two sides is very close, whereas those conducted by phone have usually put Remain well, if not necessarily comfortably, ahead. The position has left many scratching their heads trying to work out which (if either) is right.

Today we publish a paper that outlines some of the theories as to why the divergence has arisen, and considers the empirical evidence from a number of attempts to compare systematically the findings of phone and internet polls.

There are two main kinds of possible reason why internet and phone polls are producing different estimates of the relative strength of Remain and Leave. The first concerns the circumstances in which respondents are invited to say how they might vote in the EU referendum. Respondents to a phone poll have to declare their views to an interviewer, whereas those participating in an internet poll have the anonymity that comes from responding to a screen. Meanwhile, in a phone poll a respondent can always say ‘Don’t Know’ in response to a question, even if it is not offered to them as a possible answer. In an internet poll, Don’t Know either has to be offered explicitly as a possible answer or not allowed at all.

It has been suggested that those who support remaining in the EU are more reticent than those who wish to leave, and consequently are more likely to say ‘Don’t Know’ if they are invited to do so. Given that most internet polls do offer Don’t Know as an answer, it is suggested that serves to depress the level of support registered for Remain in such polls. Alternatively, however, it is argued that backing Leave might be regarded by some as a socially unacceptable view in some quarters and that consequently some respondents to a phone poll who hold that view may be reluctant to express it.
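To see the arithmetic behind this ‘reticent Remainers’ theory, here is a minimal sketch in Python. The underlying preferences and the Don’t Know propensities are invented numbers, chosen only to illustrate the mechanism:

```python
# Hypothetical illustration: an explicit Don't Know option that Remain
# supporters take more often than Leave supporters. All numbers invented.
true_remain, true_leave = 0.52, 0.48          # assumed true preferences
dk_rate_remain, dk_rate_leave = 0.15, 0.05    # assumed DK propensities

remain = true_remain * (1 - dk_rate_remain)   # 0.442
leave = true_leave * (1 - dk_rate_leave)      # 0.456
dont_know = 1 - remain - leave                # 0.102

print(f"Remain {remain:.0%}, Leave {leave:.0%}, Don't Know {dont_know:.0%}")
# Remain 44%, Leave 46%, Don't Know 10%: a 4-point true Remain lead
# shows up as a small Leave lead once the reticent supporters opt out.
```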

The second main possible reason why the two kinds of polls have diverged is that the samples of people they manage to interview differ significantly. Telephone polls are typically done by ringing landline and mobile numbers at random and, if the phone is answered, securing an interview with someone at the other end of the line. Interviewers are usually given a quota of the kinds of people they should interview (in terms of their sex, age, etc.), and thus if there is more than one person at the end of the line willing to be interviewed, the interviewer will try to interview whoever best helps them complete their quota. The approach relies heavily (though not completely) on the statistical theory that if a thousand people are surveyed at random, most of the time the estimated proportion of people stating a particular view will be reasonably close to the true proportion in the population as a whole.
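The statistical theory in question is the standard sampling-error calculation. A textbook sketch, assuming a pure random sample with no quotas or non-response (conditions real phone polls only approximate):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of an approximate 95% confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# A poll of 1,000 respondents with a roughly even split is subject to a
# sampling error of about +/-3 percentage points from randomness alone.
print(f"+/-{100 * margin_of_error(0.5, 1000):.1f} points")  # +/-3.1 points
```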

Internet polls, in contrast, are typically conducted with respondents who have previously joined a panel of people who have agreed on occasion to complete a poll. When they sign up for that task, they tell the polling company quite a lot about themselves. This means the company can draw from the panel a sample whose demographic and other relevant characteristics are in line with those of the population as a whole. This, it is anticipated, will ensure that the views expressed by this sample will be representative too.
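A minimal sketch of how such a panel draw might work, assuming (hypothetically) that the panel records each member’s sex and age band and that the company targets population shares for each demographic cell; the shares and field names below are invented for illustration:

```python
import random

# Invented population shares for sex-by-age cells; a real design would
# use many more variables (region, past vote, newspaper readership, ...).
POPULATION_SHARES = {
    ("female", "18-34"): 0.14, ("female", "35+"): 0.37,
    ("male", "18-34"): 0.14, ("male", "35+"): 0.35,
}

def draw_matched_sample(panel, n):
    """Draw a sample whose demographic cells match the population shares."""
    sample = []
    for (sex, age), share in POPULATION_SHARES.items():
        cell = [p for p in panel if p["sex"] == sex and p["age"] == age]
        sample += random.sample(cell, min(len(cell), round(n * share)))
    return sample
```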

Such different approaches to sampling would certainly appear to create plenty of scope for achieving samples that are rather different from each other. For example, internet polls are often thought to be at risk of over-representing the politically committed, whose views may not be typical of those of the population as a whole. Meanwhile, phone polls are reliant on those who are available and willing to respond to their calls over a relatively short period of time, and it may be asked whether such people are indeed typical of the general population. In this referendum in particular it has been suggested that the samples obtained by phone polls contain more graduates, many of whom are relatively relaxed about immigration and are thus more likely to be in favour of Remain – though quite why this should be the case is perhaps not so clear.

The paper assesses the empirical evidence provided by a number of attempts made during the referendum to compare the results obtained by phone and internet polls and to understand why they diverge. It comes to the view that the explanation probably lies primarily in differences in the character of the samples the two kinds of poll achieve, rather than in differences in the way they administer their questions. It looks in particular at the claim that phone polls contain more people who are educationally well qualified than internet polls, and suggests that the evidence available to date on this subject is both too limited and too inconsistent for us to come to a clear judgement. But given the importance in any case of educational background in helping us to identify who is more likely to be a Remain supporter and who is more likely to want to Leave, it would seem that pollsters should be paying much more attention to how many graduates and non-graduates they have in their samples than appears to have been the case so far.
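To make the final point concrete, here is a minimal sketch of weighting a sample to a known education profile; the target and achieved shares below are invented for illustration:

```python
# Invented figures: the achieved sample has too many graduates relative
# to an assumed population target, so graduates are weighted down.
population = {"graduate": 0.27, "non_graduate": 0.73}  # assumed target
sample     = {"graduate": 0.40, "non_graduate": 0.60}  # achieved sample

weights = {g: population[g] / sample[g] for g in population}
print(weights)
# {'graduate': 0.675, 'non_graduate': 1.216...}: each graduate respondent
# counts as roughly two-thirds of a person, each non-graduate as about 1.2,
# so the weighted sample matches the education target.
```

The catch, as the paper argues, is that such an adjustment is only possible if pollsters routinely record respondents’ educational qualifications in the first place.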


By John Curtice

John Curtice is Senior Research Fellow at NatCen and at 'UK in a Changing Europe', Professor of Politics at Strathclyde University, and Chief Commentator on the What UK Thinks: EU website.

16 thoughts on “The Divergence Between Phone and Internet Polls: Which Should We Believe?”

  1. I already commented on this website, but apart from myself nobody is attempting to question the validity of any online or phone polls published in the media (and commissioned by the media). I apologise if this is boring, but I feel this is an important aspect of the social survey process and cannot be ignored. Given that market research companies use only a sample, reasonable people will ask how they can claim to be making statements about the population on the basis of partial information obtained from the sample. Is it because they say “… we selected a representative sample…”, so trust us because we say so, OR should we ask them to SPECIFY exactly what selection procedure they followed when selecting the sample, what the response rates are, and any other detail relevant to sample selection, such as the SAMPLING FRAME USED to select the sample, stratification, etc.? Only after we have as many details as possible about the selection procedure can we EVALUATE whether a poll is valid or not. Another important aspect of the online v phone polls debate is the fact that in order to compare two methods you need to design a proper experiment in which the two methods can be compared. Comparing two methods in the way it is done at the moment does not make any sense. The fact that a SAMPLING FRAME (covering the whole of the UK) for ONLINE SAMPLES DOES NOT EXIST automatically disqualifies online polls as a reliable methodology. And the fact that an estimated 50% of UK phone numbers are ex-directory and that many households do not have a fixed phone makes phone surveys less valid, since they do not cover all eligible households.
    I am amazed how the majority of commentators on the polls and their methodology ignore basic logical questions that need to be asked when analysing the validity of any research based on samples. Leslie Kish, an authority on survey sampling, said: “Inferences from an ill-defined or undefined ‘sampled’ population to a target population and to higher inferential populations would require flights of imagination.” The BPC inquiry into the GE2015 results concluded that samples were unrepresentative, but the report does not mention what sampling frames were used to select the online and telephone samples. For example, YouGov uses its own sampling frame of no more than 500,000 volunteers. Considering the amount of “research” they do, and that all of it is based on their panel of 500,000, it is not difficult to imagine how samples are drawn for their surveys. A YouGov sample (and the samples of MR companies that use online panels) will be representative only of those in their database of 500,000, and not of the general population of the UK. So inferences can be made only about that population of 500,000. I do not know how this is difficult to understand and why everybody ignores this fact. There is no way that YouGov or any other similar company can defend making inferences about the UK based on a sampling frame of 500,000 volunteers. As Kish said, you “… would require flights of imagination” to make these inferences. And we must admit that the majority of polling organisations do stretch their imagination when interpreting the results of their polls. If YouGov and similar research companies sell research, they need to be transparent about their methodology, and this transparency requires that they publish profiles of their sampling frame (at least to their clients) and tell potential clients to what population they can make inferences from the sample. The general public is led to believe that if YouGov and similar organisations say that the “sample is representative”, we have to trust them because they say so.
    I hope that Professor Curtice will address this question in one of his blog posts.

  2. Your analysis, Professor, classifies polls into three periods, the last one being 1.4.16–19.5.16. But if we look only at polls conducted in the month of May 2016, there seems to be very little difference between phone and online polls. During May I reckon that REMAIN averaged 45% (margin of error 42-48), LEAVE 41% (39-44) and DK 13% (9-16), pretty much irrespective of whether the poll was phone or online.

    [This is based on a ‘mixed’ statistical model and excludes the idiosyncratic ORB.]

  3. Do graduates actually know more about the EU than non-graduates? It is a question of knowledge built up over the years that prompts the older generation to back LEAVE. The younger generation are more prone to the scaremongering tactics of Cameron and the Remainers.
    Common sense tells the more experienced voter, who knows that visa-free travel to Europe existed long before the EU, that to revoke it would not be to the benefit of any country.
    How any MP could spread this rumour casts doubts on their educational level.
    The rumour that mobile roaming charges are dependent on EU membership is also false; the arrangement was made by a world trade organisation member.

    1. You are equating education levels with age, implying that education is only relevant to the younger generation in shaping how they vote. There are some major demographic correlates with how people intend to vote: age, social grade, and geography, for example, all play extremely important roles, but there is no driver more important than education. The longer the time spent in education, the more likely a voter is to vote ‘remain’. This means that members of the older generation will be more likely to vote to remain if they have been educated for longer, and more likely to vote to leave if they haven’t.

  4. As a person outside the polling industry, but interested in the referendum result, I have found it difficult to follow the What UK Thinks Poll of Polls (PoP) over the last few weeks.
    As I understand it, the PoP is a simple average of the last six published polls. This would be fine in normal circumstances but, it seems to me, is much less useful given the present online/phone divergence. When online polls dominate the PoP the Leave share increases, and when phone polls dominate the opposite occurs. But surely this tells you nothing of the underlying trend, because of the violent fluctuations in the PoP. Another difficulty is that there are more online pollsters and they poll more often, which, I would have thought, gives a bias to Leave.
    Does What UK Thinks have any comments on the different approach of the Number Cruncher Politics website, which gives equal weighting to online and phone polls (both averaging rules are sketched below)? Of course if one type of poll is completely wrong and the other right, nothing helps.
    Please accept I am not questioning the obvious skill and experience of the people behind What UK Thinks. I am just a curious outsider.
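For illustration, the two averaging rules being compared in this comment can be written in a few lines of Python; the poll figures are invented, listed most recent first:

```python
# Invented (mode, remain%) pairs, most recent poll first.
polls = [("online", 44), ("phone", 51), ("online", 43),
         ("online", 45), ("phone", 49), ("online", 44)]

def simple_average(polls, last_n=6):
    """The Poll of Polls rule: average the last six polls regardless of mode."""
    recent = [r for _, r in polls[:last_n]]
    return sum(recent) / len(recent)

def mode_balanced_average(polls):
    """Average within each mode first, then average the modes equally,
    so the online/phone mix no longer drives the headline figure."""
    by_mode = {}
    for mode, r in polls:
        by_mode.setdefault(mode, []).append(r)
    return sum(sum(v) / len(v) for v in by_mode.values()) / len(by_mode)

print(simple_average(polls))         # 46.0 - pulled around by the mode mix
print(mode_balanced_average(polls))  # 47.0 - online 44, phone 50, averaged
```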

  5. But given the importance in any case of educational background in helping us to identify who is more likely to be a Remain supporter and who is more likely to want to Leave, it would seem that pollsters should be paying much more attention to how many graduates and non-graduates they have in their samples than appears to have been the case so far.

    This is your issue: to say educated people are more likely to vote Remain and us thickos are more likely to vote Leave is utter rubbish. With that comment you are trying to say that in some way more educated people understand the issues better and would come to a different opinion than someone less educated.
    What is wrong with phoning someone and asking the question, and NOT how they were educated? That is irrelevant. You are talking UTTER UTTER rubbish. Remember: PHONE a number, ask how they are going to vote, Remain or Leave, and that’s the answer; repeat 1000s of times and that’s your sample. STOP all this weighting rubbish.

    1. Professor Curtice isn’t trying to say anything of the sort you are suggesting. He is pointing to data that suggests university graduates are more likely to vote to remain than leave:

      “University graduates are keen on remaining in the EU, while those with few if any qualifications are inclined towards Leave. The 2015 British Social Attitudes survey, for example, found that as many as 77% of those with a university degree wanted Britain to continue to be a member of the European Union while just 17% wanted it to withdraw. Conversely, only 45% of those without any educational qualifications wanted Britain to continue to be part of the EU, while 43% wanted it to withdraw. This division proved to be rather sharper than the differences by social class.
      The British Election Study has uncovered a similar pattern. In interviewing it conducted shortly after last year’s general election as part of a very large internet survey that has been returning to the same respondents on a regular basis ever since the beginning of 2014, it found that 58% of graduates wanted Britain to stay in the EU, while 26% wanted to leave. In contrast, amongst those without any educational qualifications, only 29% said they wanted Britain to stay, while 52% backed leaving.”

      It is obviously YOUR issue that people with higher levels of education are more likely to vote to remain than those without lower levels of education. A pollster is trying to obtain a representative sample of the general population. Education level is one factor to consider, as are geographic location, age, party affiliation etc. Professor Curtice points out that details of a respondent’s educational qualifications are not routinely collected by polling companies. The suggestion is that phone polls over-represent people with higher levels of education. So if anything, asking people about their education level and then adjusting the sample to represent the education levels of the general voting population should give a more accurate representation of how the electorate is leaning.

      1. *Correction*

        The first sentence of the second paragraph of my post should read

        …people with higher levels of education are more likely to vote to remain than those with lower levels of education.

        YouGov also published data from two polls in May 2016 that supports this observation:

                                   Online           Phone
                                 Remain  Leave    Remain  Leave
        Educated up to age 16      20%    57%       22%    51%
        Educated to 17-19          37%    39%       32%    41%
        Educated to 20+            57%    24%       55%    22%

  6. Thanks for publishing the paper. What I find incredible is that YouGov isn’t prepared to publish or acknowledge criticism of its work on its website. For example, I tried to place the following excerpt from John Curtice’s report in the comment section of YouGov’s website, only to have their moderator reject it.

    In the words of Professor Curtice:

    YouGov’s work started from the premise that phone polls contain too many graduates. However, far from containing too many people who finished their full-time education at age 20 or older, the phone poll that was conducted on the company’s behalf actually contained too few such respondents. Just 22% of the sample said that they finished their education at that age, a proportion that was then weighted up to as much as 34%. As a result, far from reducing Remain’s estimated share of the vote, the overall impact of the weighting of the phone poll was to increase it from 43% to 48%. In short, what in fact the exercise really demonstrated is that, contrary to YouGov’s supposition, phone polls do not necessarily contain too many graduates after all.
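The arithmetic behind that passage is worth spelling out. The group shares (22% unweighted, 34% weighted) come from the excerpt; the within-group Remain shares below are invented for the sketch, so it reproduces the direction of the effect rather than the exact 43% to 48% shift (which also reflected other weights):

```python
# Shares of the phone sample finishing education at 20+: 22% unweighted,
# weighted up to 34% (from the excerpt). Within-group Remain support is
# assumed for illustration; graduates lean Remain.
unweighted = {"educated_20_plus": 0.22, "educated_earlier": 0.78}
weighted   = {"educated_20_plus": 0.34, "educated_earlier": 0.66}
remain_in  = {"educated_20_plus": 0.60, "educated_earlier": 0.38}  # assumed

def headline(shares):
    """Overall Remain share implied by group shares and within-group support."""
    return sum(shares[g] * remain_in[g] for g in shares)

print(f"unweighted: {headline(unweighted):.1%}")  # 42.8%
print(f"weighted:   {headline(weighted):.1%}")    # 45.5%
# Weighting graduates UP pushes the Remain share up, the opposite of what
# one would do if phone samples really contained too many graduates.
```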

    YouGov has lost all credibility in my eyes.

    1. As a pollster, YouGov was given a grade of C+ for its polling performance in the US by Nate Silver at Fivethirtyeight.com. It would be great if someone ranked the pollsters in the UK in the same manner.

  7. The social acceptability of answers given by phone polling interviewees indeed remains a major conundrum for the whole exercise.
    How many Austrians would admit to voting for the FPO, or Germans to the AfD, if rung up by a complete stranger right now? Exactly.
    The status-quo-acceptable answer is always going to introduce a bias in this situation.
    Ask them tangentially whether they ‘know someone’ who voted for the FPO or AfD and in many cases they would give you a completely different answer.

    The latest poll of polls is also heavily biased towards phone polls at the moment.

    Meanwhile I’ll continue to make money betting against EU shares, shorting them to the bottom as we speak. Good luck, longs!


  8. Surely it is more to do with whether there are more online or phone polls in the ‘poll of polls’ list. If the pattern follows as before, there will be mostly online polls for the next few weeks, which will reduce the Remain share. Then when the phone polls kick in again the Remain share will increase once more. As the article above suggests, the question is which type of poll is correct.


  9. I notice that after getting as wide as 55/45 last week, the poll of polls is now tiptoeing back in the direction of 50/50. This seems to be a recurring pattern, and it seems to be driven by scaremongering.

    You get a week of predictions that Brexit will sterilize every male in the country, and things swing to 55/45, and then the hysteria settles down, and we gradually drift back toward the centre. Scaremongering just doesn’t seem to have a long-lasting effect.

  10. Does ‘man/woman in the street’ polling take place any more, or has the EU banned this practice when I wasn’t looking?

  11. To add to my first comment: I also think the working class are more likely not to take part in a phone poll in the evening, as, being working class, I would not entertain a call of such a nature. On the other hand I would take part in an internet poll if one were to pop up on a website, etc. I do think that this polling system is overthinking it. I always say KISS: the more you try to allow for this and that, the more you skew the end result.

  12. You are missing a vital point: there are millions of working-class people who cannot use, or do not have, a phone at work in normal working hours. Also, many people now do not allow unsolicited calls to their landline or mobile. There are many working-class people whom none of your phone polls will sample. I would therefore say internet polls are more reliable, as they are available for a longer period and therefore allow more working-class people to take part.
