A Scintilla of Movement to Leave?

Posted on 8 June 2016 by John Curtice

The week is only half over, yet it has already been one of the more dramatic so far as referendum polling is concerned. Two polls have claimed that Leave now enjoy a record or near record lead, while two others contained some evidence that Leave may have gained some ground. Perhaps this is the first week in which the balance of opinion has genuinely moved?

The first poll to put Leave on a high came on Monday from the internet polling company YouGov (for Good Morning Britain). Once Don’t Knows are put to one side, it put Leave on 52%, Remain on 48%. It was the first time the company had reckoned Leave were that far ahead since the beginning of February. This was followed just hours later by the publication of another online reading, this time from ICM. This put Leave even further ahead, by 53% to 47%. Never before had ICM put Leave that far ahead.

Meanwhile a couple of other polls at around the same time could also be read as providing evidence of a swing to Leave. The first was an online poll from Opinium released on Sunday. Now, as it happened, Opinium had decided to adopt a methodological change, the detail of which we discuss below. Here we should simply note its effect: what would otherwise have been a four point swing to Leave since the company’s previous poll, enough to put Leave ahead by 52% to 48% (as the poll was reported by The Observer), became just a one point movement, leaving Remain still narrowly ahead.

The second came from a phone poll conducted by ORB and published on Tuesday. As we have discussed previously, in reporting ORB’s polls the Daily Telegraph tends to focus on the figures for those who say they are certain to vote, rather than those for all respondents. As it happens, the figures in ORB’s latest poll for those who said they are certain to vote were particularly dramatic. They put Remain and Leave on 50% each, representing a three point swing to Leave and a much better result for Leave than that obtained by most phone polls.

All in all this looks like a substantial body of evidence supporting the idea that Leave have made some progress. However, there are some caveats and cautions to take on board too.

First of all, a second poll from YouGov that was released on Tuesday (in The Times) and conducted a couple of days later than the poll that appeared on Monday, failed to replicate the swing to Leave in that earlier poll. Remain were credited with 51%, Leave with 49%, well in line with the readings of many another YouGov poll.

Second, while ORB’s figures for those certain to vote put the two sides neck and neck, those for all respondents told a very different story. They put Remain on 57% and Leave on 43%, very much in line with the figures in many another phone poll and actually representing a two point swing to Remain as compared with the company’s previous poll last week. While it is quite common for polls (both internet and phone) to find that Leave supporters are more likely than Remain voters to say they will make it to the polls, the gap between the two in this poll was unusually big (69% of Leave supporters said they would vote, compared with just 54% of Remain supporters). There must be a suspicion that the chance variation to which all polls are subject may have helped in this instance to exaggerate the difference in the relative propensity of the two sets of supporters to participate in the ballot on June 23rd.
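The arithmetic of such a turnout filter can be sketched as follows. This is an illustration of the general principle using the figures quoted above, not ORB’s actual procedure (which will also involve its own weighting and rounding):

```python
# Illustrative sketch: how re-basing a poll on "certain to vote" respondents
# can shift the headline figures. Inputs are the shares among all respondents
# and the proportion of each side's supporters saying they are certain to vote.

def certain_to_vote_shares(remain, leave, remain_certain, leave_certain):
    """Re-base headline shares on certain-to-vote respondents only."""
    r = remain * remain_certain   # Remain supporters who say they will vote
    l = leave * leave_certain     # Leave supporters who say they will vote
    total = r + l
    return round(100 * r / total, 1), round(100 * l / total, 1)

# Figures from the ORB poll as reported: 57/43 among all respondents,
# with 54% of Remain and 69% of Leave supporters certain to vote.
print(certain_to_vote_shares(0.57, 0.43, 0.54, 0.69))  # (50.9, 49.1)
```

Re-based on those certainty figures, a 57/43 split among all respondents becomes roughly 51/49, close to the 50/50 headline, which is why an unusually large certainty gap can produce such a dramatic certain-to-vote figure.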

Third, we need to consider the possible implications of Opinium’s decision to change their weighting strategy. This change consisted (primarily) of weighting their data by some indicators of social conservatism and national identity. Both of these characteristics have been shown to be correlated with attitudes towards the EU, while some previous experimental work undertaken by Populus and Number Cruncher Politics has suggested that internet polls may be obtaining samples that are too socially conservative. Opinium appear to have decided that this insight has some validity, and their weighting seems designed to correct what they now accept may be samples that are too socially conservative. If they are right, and their insight is also true of other companies’ internet polls, then of course it implies that some of the leads for Leave obtained in other polls this week may exaggerate its strength.

So how might we make sense of this apparently contradictory evidence? Well, it is worth bearing in mind that one of the reasons why some commentators have been inclined to believe that there has been a swing to Leave during the last fortnight is that the Leave side has found it easier to influence the media agenda since the onset of ‘purdah’ on May 27th. No longer has the UK government been able to command the agenda by successively publishing papers written and researched by the civil service and warning of the allegedly dire consequences of leaving.

If the onset of purdah has made a difference then we should be able to discern a difference between the results of the polls conducted since May 27th and those undertaken beforehand. Eleven internet polls of referendum voting intention were conducted between the beginning of May and the onset of purdah (one of which is a poll by TNS that was only released on Monday). On average those polls put Remain on 50%, Leave on 50%, exactly in line with the average position in the internet polls ever since last September. In the half dozen internet polls conducted since May 27th, Leave has averaged 51%, Remain 49%. So maybe there has been just a scintilla of movement in the direction of Leave. However, we have to bear in mind that just three phone polls have been conducted since May 27th, including the remarkable ICM poll that put Leave ahead, so we have too little evidence from this kind of polling to confirm this impression.
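The comparison just described amounts to a conventional swing calculation. As a sketch, using only the averages quoted above (and taking swing in its standard sense of half the change in the gap between the two sides):

```python
# Illustrative sketch of the pre-/post-purdah comparison using the
# averages reported in the text, not the individual poll results.

pre_purdah = {"Remain": 50.0, "Leave": 50.0}    # avg of 11 internet polls, 1-27 May
post_purdah = {"Remain": 49.0, "Leave": 51.0}   # avg of 6 internet polls since 27 May

# Conventional swing: half the change in the Leave-minus-Remain gap.
gap_before = pre_purdah["Leave"] - pre_purdah["Remain"]
gap_after = post_purdah["Leave"] - post_purdah["Remain"]
swing_to_leave = (gap_after - gap_before) / 2
print(swing_to_leave)  # 1.0 point
```

A one point swing on the internet-poll averages: a scintilla, as the title has it, and small enough that sampling variation alone could account for it.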

A couple of other points to bear in mind from this week’s polling. First, Opinium are not the only company in recent weeks to have made methodological tweaks that have been reported as making the results more favourable to Remain. So also have ICM and YouGov in their internet polls and ComRes in their phone polls. Given the divergence between internet and phone polls in this referendum and the difficulties that the polls had in estimating Conservative and Labour support in last year’s general election, it is not surprising that the pollsters should be constantly trying to improve their methods. However, they may need to be wary of any temptation to tweak their polls so that they all ‘herd’ in the direction of what they think the result will be. In any event, the fact that Leave have edged ahead in the most recent internet polls despite these methodological tweaks lends a little greater weight to the suggestion that there might have been a scintilla of movement in its favour.

Second, as regular readers will be aware, we have argued here that, given it is one of the key demographic divides in this referendum, pollsters should be paying greater attention to the composition of their samples in terms of their respondents’ educational attainment. A couple of this week’s polls have provided information on this feature of their samples. In the second of the polls that they published this week, YouGov reported that (after weighting, though the weighting did not have a substantial effect) 27% of their sample had a degree and another 8% some form of higher education other than a degree. These figures are not too dissimilar to the nearest equivalent figures in the most recent British Social Attitudes survey (conducted face to face and using random probability sampling), which estimates that 24% of all adults are graduates while those with some other form of higher education constitute another 11%. This similarity provides some initial assurance that internet polls do not necessarily contain too few graduates, and that their tendency to report a lower vote for Remain is not occasioned by any such under-representation.

TNS, meanwhile, (in their poll that was released on Monday but was conducted a while ago) reported that 28% of their sample were graduates, initially suggesting at least that their samples too do not suffer from a deficit of graduates. However, their question on educational qualifications seemingly did not differentiate between the wide range of other qualifications that people might have, raising some doubt, perhaps, about how those with other forms of higher education might have classified themselves. Meanwhile, at 10%, the proportion without any qualifications in TNS’s sample is markedly lower than the 18% to be found in the most recent BSA – but given that this is one of the groups that is least likely to vote for Remain, any under-representation of this group certainly cannot help account for the fact that this poll put Leave ahead by 51% to 49%. Hopefully, future polls will also tell us more about the educational background of their respondents.

 


By John Curtice

John Curtice is Senior Research Fellow at NatCen and at 'UK in a Changing Europe', Professor of Politics at Strathclyde University, and Chief Commentator on the What UK Thinks: EU website.

20 thoughts on “A Scintilla of Movement to Leave?”

  1. I don’t know how significant this is, but the Independent newspaper is reporting today that it commissioned ORB to do a referendum survey, and they reported 55/45 for Brexit. Maybe Prof. Curtice will comment on the methodology of the poll.

    1. Prof. Curtice will have difficulty commenting on the ORB poll, since ORB did not publish relevant information about their methodology. What is available on their website is a summary of tables for each survey question. Not to repeat myself, but all this online polling is a complete joke. How can we judge whether a poll is valid if we do not have any information about its methodology? We cannot simply say, yes, I trust you because you said you conducted the poll. Market research companies are eroding the reputation of the research industry.

      1. Sherlock, have you never heard the old saw “There are lies, damn lies and then there are statistics”?

        We will only know for sure on the 24th.

  2. Is there a case for certain polling companies to have a vested or biased interest in the outcome?

    I could see this happening in one of two ways.

    One side, believing it to be losing the argument, pays a pollster to big-up the results one way or another… the other side, if it feels it is winning, may pay a pollster to frig/weight the results to suggest an underdog situation, thereby putting the frighteners on undecided voters to get out and vote.

    Not sure this happens, but I’ve seen so many vested interest groups in my life that I’m cynical enough to believe the winning or losing side will try and use pollsters in any beneficial way they can. Thoughts, please?

  3. There should be a poll (telephone & online) to determine what proportion of the general public go out of their way to take part in online polls.
    I would say the online results would be greatly exaggerated.

    1. You cannot use a telephone or online sample based on dodgy sampling frames to determine the proportion you are after. Online surveys use volunteer databases to select samples that are representative by age, gender, education, working status, etc. So the samples selected for online surveys represent only the people who volunteered to be on the database. Inferences from such a sample about the general UK population require “flights of imagination” (to quote Leslie Kish, an authority on survey sampling techniques). No conclusion from an online poll based on a sample of volunteers will be valid. Clients who pay for online polls requiring generalisation about the UK population do not understand the basic concepts of sampling and surveys, and they are throwing their money away. Research companies who provide online polls should tell their clients what type of product they are buying, i.e. a product that should not be published in national newspapers, on TV, radio, social networks, etc. Arguably the product (online polls) should not be offered to clients at all.

      A telephone sample is slightly better than an online one, but the prevalence of ex-directory numbers and of mobile-only households without a fixed line makes telephone surveys less reliable too. Telephone samples also have very low response rates (often 10–15% of the issued sample).

      Example: YouGov have a database of around 500,000 volunteers who are ready to answer any of their polls. It would be interesting to know the demographic and geographical profile of this database, since it is the base for YouGov’s polls and surveys.
      Other research companies that run online polls use similar databases.


      1. I suspect these databases attract those who have strong opinions, rely heavily on information and research in decision-making, are politically committed, or for some reason wish to influence outcomes in some way. These don’t sound like the average person to me, who in my experience is often rather disengaged, bored with government, or a follower of the herd. I also suspect the databases may be slightly better educated, or at least more likely to value the idea of education, information and research [than someone, say, stopped in the street at random].

        I may be wrong, but I would expect such groups to have a slight bias towards Remain, since they tend to favour education and are rather akin to the people who like to be “informed” when they buy something, e.g. those who read or follow Which? or the GH. In my experience these people tend to be a little more middle class and tend to favour posh and often pricey shops like Lakeland and John Lewis, which offer customer service and quality but are not price competitive.
        In addition, those who participate willingly in research are likely to value ideas and information, and are arguably more likely to reflect the views of the better educated, even if they aren’t themselves [the bias may be very small].

  4. Changing “methodologies”, changing “weighting strategies”, changing this and changing that. Why would anyone trust any of the published polls? Because they say “trust us, we know what we do”. We do not have the slightest idea how they selected the sample or how they collected responses. They will never publish any details about the sampling methodologies used. You do not have to be a genius to find out what methodologies they employ to select a sample: just go to the website of any of the polling companies listed above and you will find very strange and very short explanations about sample selection. Dodgy sampling methodologies aside, it is pretty clear the polls all concentrate around 50%. Why? Because all these “weighting strategies” are used to “adjust” the unweighted data. They can start adjusting and massaging the unweighted data however they want; sometimes it will put Leave ahead by 1–2%, and sometimes Remain by 1–2%, or, to make things more interesting, even 3% ahead. The difference depends on the competitors and their polling results; as long as you are around 50% +/- 1–3% you are safe, regardless of the final result.

    I want to see the following information about a poll before making any inferences about the validity of its results:

    1. Sampling frame used (i.e. population surveyed)
    2. Issued sample size
    3. Achieved sample size
    4. Response rates
    5. All details about sample selection and data collection
    6. Margin of error
    7. Who paid for the survey
    8. Weighting procedures (details)
    9. Unweighted and weighted results

    It is not enough to say “… we selected representative sample of 1,000 adults aged 18+ …”

    Details, details and details about polling methodology, please!
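    On point 6, the margin of error conventionally quoted is at least easy to compute. The sketch below uses the textbook normal approximation for a proportion from a simple random sample; note that the formula assumes random sampling, which is exactly what is in doubt for volunteer panels:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p estimated
    from a simple random sample of size n (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# For a 50/50 split and the typical n = 1,000:
print(round(100 * margin_of_error(0.5, 1000), 1))  # 3.1, i.e. about +/- 3 points
```

    With polls clustering around 50% +/- 1–3%, the differences between them are well within this margin, even before any non-sampling error is considered.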

    1. I agree. Another consideration may be important: there seem to be local and regional differences between respondents. In certain areas or towns [e.g. ports and areas which once relied on fishing, such as Hull, Cornwall, the Hebrides, etc.] Leave voters are more prevalent than might be expected. On the other hand, university towns [e.g. Oxford, Bath, and Scotland] seem to be more inclined towards stay. So if one of these types of area is over- or under-represented it could skew the result. How good is the geographical distribution of the samples, and do they take account of these potential biases? We, or at least I, don’t know.

  5. I can only go by my own behaviour patterns and those of people I don’t know, but isn’t it more the case that a significant swing might REDUCE the incentive for the “herd” to vote with the majority (who will do the work for them), but might on the other hand give the lazier voters inclined in the other direction an impulse to actually vote? And what about the good old British tradition of supporting the underdog?

  6. At this stage of the campaign many of the ‘don’t knows’ will be the less engaged (for whatever reason), therefore less aware or understanding of the opposing arguments, and almost by definition indecisive. In reality many are looking for somebody to decide for them. If the polls do break significantly in any direction I expect a substantial ‘herd instinct’ to follow in that direction.

  7. It seems to me that ignoring don’t knows when their proportion of the total is so high invalidates any conclusions one might draw. I’m sure that with the publicity building (and see the parallel pressure to extend the registration deadline) many don’t knows will turn into voters, and many will turn into Remain voters, because of a bias to the status quo (rather than because the Remain arguments are necessarily more compelling). I suspect that Remain will get at least 5 percentage points of incremental support from current don’t knows. The more adjustments applied, from declared likelihood to vote (when this is bound to go up on polling day) to adjustments for every conceivable categorisation of the voters (star sign, anyone?), the more impenetrable the reported poll numbers become.
    Personally, I am looking at phone polls only and adding all the don’t knows to Remain, and that is staying remarkably consistent at 60:40. Clearly this would overstate Remain, but my guess is that Remain will win by at least 55:45 and will probably get close to 60%.

    1. I would tend to agree that the DKs are absolutely crucial and agree with other comments that they are less engaged but will get to some level of interest before polling day.

      I also think (and it is pure opinion) that the final result will be somewhere between 55 and 60 for Remain, as I expect that DKs will break for the status quo (hard to say by how much, but I think 60/40 and maybe not much more) and, actually, soft Remain will not fly to Leave as much as soft Leave will fly to Remain in the privacy of the polling booth (the “devil you know” aspect). In a Yes/No scenario this will perhaps be worth a couple of points.

      1. I think Remain has to get at least 60 + 2:1 among the young to put the matter to bed.

        To save me checking all the various pollsters’ tables, I wonder whether Prof Curtice (or one of his colleagues) could let us know whether these polls include voters from Northern Ireland and ex-pats? Either way, could they offer an opinion of how these two cohorts are likely to vote?

        I was surprised that a poll in Gibraltar had support for Remain at 88%.
        http://chronicle.gi/2016/04/gibraltar-will-vote-to-remain-in-eu-poll/

        1. Gib is very loyal to GB, but its outlook is not anti-EU. Gib’s problems are the years of border and airspace disputes with Spain.

          1. … which I guess will get worse if the UK leaves the EU, because then Gibraltar has to go as well.

    2. To pwgc: I hate to say that I have some sympathy with your view; the level of engagement in a certain section of the population can be as low as “what is this EU thing?”. In my optimistic moments I feel that few such people will vote. In my blacker moments… what could sway it is an expressed view that the price of fags will rise if we leave.

      Brain transplants should be available on the NHS…?
