Are Leave Voters Less Knowledgeable About The EU Than Remain Voters?

Posted on 23 November 2018 by Noah Carl

In a forthcoming paper, my colleagues and I report the results of a study comparing levels of EU knowledge among Leave voters and Remain voters. We gave a 15-item EU knowledge quiz to a nationally representative sample of the British population via an online survey. Our quiz comprised 9 ‘ideologically neutral’ items, as well as 6 items that we deemed more ‘ideologically convenient’ for one side or the other.

Each item comprised a positive statement, followed by the question ‘True or false?’ For example, one item read: ‘The British government cannot sign free trade deals while Britain is a member of the EU. True or false?’ The correct answer is ‘true’, and since being able to sign free trade deals is generally seen as a good thing, we deemed this item more ‘ideologically convenient’ for Leave voters. Another item read: ‘More than ten per cent of British government spending goes to the EU. True or false?’ The correct answer is ‘false’, and since spending less on something is generally seen as representing value for money, we deemed this item more ‘ideologically convenient’ for Remain voters.

Our study uncovered three key findings. First, despite the fact that Remain voters scored higher on a short test of probability reasoning (a result that accords with what has been previously reported), there was no average difference in EU knowledge between the two groups. (In our Supporting Information, we report the results of a separate EU knowledge quiz from an earlier wave of our survey, which yielded similar results.) This finding is contrary to a presumption made by some commentators that Leave voters were less well informed than Remain voters.

Second, both Leave and Remain voters were more likely to answer correctly on items that were ‘ideologically convenient’ for them, a finding for which there are two plausible explanations. Respondents may have engaged in ‘motivated reasoning’: when confronted with an item to which a particular respondent did not know the answer, she may have selected whichever option was most psychologically comforting for her, given her ideological priors. Respondents may also have engaged in ‘motivated information seeking’: in the months leading up to the quiz, they may have preferentially consumed sources of information that flattered their ideological priors, leading them to acquire more knowledge about their own side than about the opposing side.

Third, characteristics such as older age, higher education and stronger political interest were quite strongly associated with political knowledge on the 9 ‘ideologically neutral’ items, but were only weakly associated with political knowledge on the 6 ‘ideologically convenient’ items. One possible explanation for this discrepancy is that ideological bias (i.e., the tendency for partisans to give ‘ideologically convenient’ answers) overwhelmed the advantages that would otherwise have been conferred by older age, higher education etc.

It is important to make clear that our finding of no average difference between Leave and Remain voters does not imply that voters overall were well informed prior to making their decisions. Nor does it rule out the possibility that misinformation had a decisive impact on the result of the referendum, as some commentators have maintained.

Furthermore, it is worth comparing the results of our study to the findings of the Brexit Misperceptions study, which was published a couple of weeks ago. While the authors of that study did not report an average EU knowledge score for Leave voters or Remain voters, they did find that Leave voters were less accurate than Remain voters in most (but not all) of the domains they asked about.

What explains the disparity between our results and theirs? One possibility is that it stems from differences in our respective samples. For example, our sample might have happened to include more knowledgeable Leave voters. Although both were online surveys, ours was carried out by Kantar, whereas theirs was carried out by Ipsos MORI. In addition, ours was done in the spring, whereas theirs was done in October.

A more interesting possibility is that the disparity stems from the fact that they included a much greater number of items that were more ‘ideologically convenient’ for Remain voters. For example, they asked whether immigration from Europe had increased crime, decreased the quality of healthcare services, and increased unemployment among low-skilled workers. And on the basis of the recent Migration Advisory Committee (MAC) report on European Economic Area (EEA) migration, they determined the correct answer to be “no” in all three cases. Perhaps unsurprisingly, they found that Leave voters were much less accurate than Remain voters.

While it is of considerable interest to gauge public beliefs about the impact of EEA migration, there are obvious disadvantages to asking questions in this way. First, although the MAC report should be taken very seriously, statements about the causal impact of EEA migration cannot be considered definitive in the same way as statements about, say, which countries are members of the EU (items 1-3 in our quiz). In other words, nobody can dispute that Austria is a member of the EU, but there can be reasonable disagreement (up to a point) about the causal impact of EEA migration.

To take one example, the MAC report states that “there is no evidence that migration has affected crime” (p. 99). However, on p. 103 it concedes that EEA migrants “are more likely to receive a caution/conviction than their share of the population would suggest” (see Table 6.1 on p. 104). The report suggests that this discrepancy is attributable to the fact that EEA migrants are more likely to be young and more likely to be male, which seems altogether plausible. Yet this implies that there are two different ways of interpreting a question about the impact of EEA migration: as the gross effect, or as the net effect after taking demographic characteristics into account. (And note that the former interpretation arguably makes more sense given that the gross effect is what people actually observe.)

Second, it is possible that the causal impact of EEA migration on some macro-variable (e.g., crime) might be small but non-zero. And in that case, there would be no clear right or wrong answer to a question about the direction of impact. Strictly speaking, even if the true effect were minuscule, it would still be correct to say “EEA migration has increased crime”. Yet for all practical purposes, an answer of “EEA migration has not increased crime” would suffice.

Of course, this is not to say that the authors of the Brexit Misperceptions study took an invalid approach in choosing to focus on the impact of EEA migration. As a matter of fact, the disparity between our results and theirs highlights an interesting methodological choice facing researchers attempting to compare levels of political knowledge among groups of partisans. When it comes to ideologically fraught topics like Brexit, there are several options available: include an equal number of questions that are more ‘ideologically convenient’ for each group; ask a sufficiently large number of questions to cover all relevant aspects of the phenomenon; or ask questions in such a way as to minimise the impact of ‘ideological convenience’ (e.g., ask for precise quantitative values; but see here).

The major disadvantage of the first option is that one may end up incorrectly appraising the respective levels of political knowledge in different groups as a result of artificially equalising the opportunities for partisans to give ‘ideologically convenient’ answers. The disadvantages of the second and third options are twofold. First, it may be costly to ask such a large number of questions, or to deviate from a simple true-or-false question format. Second, it may be difficult to formulate questions about certain aspects of a phenomenon that are nonetheless very important. This point is illustrated by the above discussion about EEA migration; consider also how hard it would be to assess people’s knowledge about, say, the long-term value of national sovereignty.
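How much item balance matters can be seen in a toy simulation (my own illustration, not taken from either study). Assume two groups of partisans with identical true knowledge of 50% per item, where a respondent who does not know an answer guesses the option that is ‘ideologically convenient’ for their side, and guesses at random on neutral items. With a balanced item set the groups score roughly equally; with an imbalanced set, the favoured group appears more knowledgeable despite equal underlying knowledge.

```python
import random

random.seed(42)

def simulate(n_items_a, n_items_b, n_neutral,
             knowledge=0.5, n_respondents=10_000):
    """Mean quiz score for groups A and B, where respondents who don't
    know an answer guess the 'ideologically convenient' option."""
    scores = {}
    for group in ("A", "B"):
        total = 0
        for _ in range(n_respondents):
            correct = 0
            # Items convenient for group A: an unknowing A-partisan's
            # convenient guess happens to be the right answer; an
            # unknowing B-partisan guesses the opposite and is wrong.
            for _ in range(n_items_a):
                if random.random() < knowledge or group == "A":
                    correct += 1
            # Items convenient for group B: mirror image of the above.
            for _ in range(n_items_b):
                if random.random() < knowledge or group == "B":
                    correct += 1
            # Neutral items: unknowing respondents guess 50/50.
            for _ in range(n_neutral):
                if random.random() < knowledge or random.random() < 0.5:
                    correct += 1
            total += correct
        scores[group] = total / n_respondents
    return scores

print(simulate(3, 3, 9))  # balanced item set: groups score about equally
print(simulate(1, 5, 9))  # imbalanced set: group B appears more knowledgeable
```

The item counts, the 50% knowledge level, and the guessing rule are all assumptions chosen for illustration; the point is only that the measured gap between groups moves with the balance of ‘convenient’ items even when true knowledge is held fixed.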

My colleagues and I hope that our study will prompt other researchers to conduct their own surveys of the British public’s EU knowledge, as well as to consider how best to deal with the issue of ideologically convenient responding.


Noah Carl is a postdoctoral research fellow at St Edmund's College, Cambridge

2 thoughts on “Are Leave Voters Less Knowledgeable About The EU Than Remain Voters?”

  1. I think both studies illustrate the risks of quantitative methods such as online surveys.
    Firstly, the wording might be interpreted in a number of ways. A single crime committed by a migrant ‘increases crime’. It may not increase the crime ‘rate’, but that of course depends on your definition: absolute numbers (e.g. N per year) are a valid definition of a rate.
    Also, without knowing why a person votes Leave or Remain, measuring their wider knowledge doesn’t unambiguously validate their opinion. Does ignorance of, say, macroeconomic theory discredit an opinion influenced by other knowledge or strongly held principles? Conversely, those with the greatest knowledge of the EU are those embedded in its structures, who are more likely to benefit from its spending.
    And a generic issue is the selection bias introduced by low response rates (inherent in online surveys).
    So whilst the findings in this article are interesting, it would be easy to draw inaccurate conclusions about the beliefs and motivations of the respondents, or of the wider population that we might presume them to represent.
