Opinion polls are important for measuring public opinion on political topics, including who people want to vote for, which leaders they support and what their views are on major policies.
Like other forms of data, opinion polls sometimes provoke tendentious reasoning: a partisan dismissal of what the survey might be suggesting. This kind of reaction is easily found on social media.
Tendentious reasoning: there are several standard responses to opinion polls, such as claiming that surveys are too small or unrepresentative.
Survey differences: survey results can differ for legitimate reasons, such as question wording, question order, timing, methodology, and sampling variability.
Deltapoll and data
As an example, I am going to look at a recent Deltapoll survey, commissioned by The Sun on Sunday, conducted between 14th and 16th August 2018. The data tables can be read online.
The survey covered multiple topics, including asking:
Do you agree or disagree with the statement ‘I have changed my mind on whether we should leave the EU or not?’
It should be noted that the interpretation offered by Robbie Gibb, the Number 10 Director of Communications, somewhat misunderstands the question: a person could change their mind from a definite position (either Remain or Leave) to being unsure.
Among respondents recalling their vote in the 2016 EU referendum, the poll shows a marginally higher rate of agreement among Remain voters (15%) than Leave voters (11%). Plausibly, this question could also be capturing a sense of ‘democratic duty’ among Remain voters to see the referendum result implemented.
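Whether a 15% versus 11% gap is larger than sampling noise depends on the subgroup sizes, which are given in the published data tables. As an illustrative sketch only (the subgroup sizes of 900 Remain and 900 Leave recallers below are assumptions for demonstration, not the actual table figures), a pooled two-proportion z-test can be computed as:

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Pooled two-proportion z statistic for comparing two sample rates."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical subgroup sizes: check the real data tables before relying on this.
z = two_prop_z(0.15, 900, 0.11, 900)
print(f"z = {z:.2f}")
```

A z statistic above roughly 1.96 would suggest the gap is unlikely to be pure sampling noise at the 95% level, but the conclusion is only as good as the assumed subgroup sizes.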
However, the Number 10 Communications Director’s statement on this survey inspired numerous standard responses on Twitter:
- ‘It is an online survey, so not scientific’: The implication is that this was an open-access survey of Sun on Sunday readers. Whilst self-selecting surveys are indeed unrepresentative, this was a poll of an internet panel, weighted to be representative of the population.
- ‘Tiny sample’: Deltapoll asked 1,904 adults in Great Britain, a typical sample size for a national survey. To use George Gallup’s analogy, a chef only needs to sip a well-stirred soup to know its taste.
- ‘I have never heard of Deltapoll’: Whilst the company launched in March 2018, they are members of the British Polling Council and abide by its transparency rules.
- ‘No one I know has changed their mind’: Your mates are not a representative sample.
- ‘It is in The Sun’: The identity of the sponsor, or of the newspaper that chooses to publish survey data, does not invalidate social research.
- ‘Rubbish!’: Thank you for your input.
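The ‘tiny sample’ objection can be checked with a standard margin-of-error calculation. A minimal sketch in Python, assuming simple random sampling at 95% confidence and ignoring any design effect from weighting:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A sample of 1,904 adults, using the worst case p = 0.5
moe = margin_of_error(1904)
print(f"Margin of error: +/- {moe * 100:.1f} percentage points")
# → Margin of error: +/- 2.2 percentage points
```

A sample of around 1,900 respondents therefore pins a proportion down to within roughly two percentage points, which is why this is a common sample size for national polls.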
Why might surveys differ?
Recently, most opinion polls have shown a similar pattern of ‘switching’: Leave voters switching to Remain or becoming unsure at generally higher rates than those recalling a Remain vote.
Here are some good reasons as to why survey results may differ:
Question wording: Two surveys may word a question differently, prompting different responses from people answering. Sometimes, the effect of wording on seemingly similar questions can be large.
Question order: As memorably illustrated in Yes, Prime Minister, the order of questions can lead people to give a different response to a survey question.
Different times: Opinions can change over time, and we should look for trends in survey results.
Different methodology: The mode of a survey can affect the results. For example, people may feel unwilling to give a socially undesirable answer to an interviewer, but would select a radio button on an internet panel survey. Sampling methods and weighting schemes also differ between market research companies.
Sampling variability: There is an inherent cost in questioning a sample rather than everyone. Even with a sound method of drawing a survey sample, not every sample will look exactly like the population.
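Sampling variability can be illustrated by simulation. This sketch assumes a hypothetical population in which exactly 13% of people agree with a statement (a figure chosen only for illustration), then draws repeated samples of 1,904 people to show how the observed rate scatters around the true value:

```python
import random

random.seed(42)

TRUE_RATE = 0.13   # hypothetical population agreement rate (assumption)
N = 1904           # sample size matching a typical national poll

def one_sample_estimate():
    """Draw one simple random sample and return the observed agreement rate."""
    agrees = sum(random.random() < TRUE_RATE for _ in range(N))
    return agrees / N

estimates = [one_sample_estimate() for _ in range(1000)]
print(f"min = {min(estimates):.3f}, max = {max(estimates):.3f}")
# Individual samples differ from the true 13% purely by chance,
# typically by no more than a couple of percentage points.
```

No single sample is guaranteed to hit 13% exactly; the spread across the 1,000 simulated polls is the sampling variability that pollsters summarise as a margin of error.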
Partisanship can affect our reasoning, and we should be wary of this tendency when making political decisions.