Sky News and Social Media Surveys
Sky News used a Twitter ‘poll’ in its reporting, as ITV continue to do.
On the Bank Holiday, writer Matt Chorley (Times Radio) started some surveys on Twitter. The results of this social media survey were the subject of a Sky News online article. Later, a Sky News journalist said this Twitter survey was “a poll by Times Radio”.
This article discusses self-selection bias and reporting guidance. Broadcasters should follow Market Research Society guidance: do not report on self-selecting surveys.
Clutching at straw polls
On social media, users may start surveys for other accounts to vote in. These surveys have many problems: they are not representative of the general public, or even of the platform’s users.
There are no controls other than one vote per account: the same person can respond many times using multiple accounts, and there is nothing to stop people outside the population of interest from voting. Anyone with an account can take part. Sir Robert Worcester coined the term ‘voodoo poll’ for such open-access surveys.
People on social media are not representative of the population. Social media users tend to be younger, with higher education levels.
Voluntary responses introduce an unknown error. Those who vote are those who both see the survey and are animated enough to take part. Because people volunteer their answers, we get self-selection bias.
People choose whether to take part. In self-selecting surveys, extreme responses are often over-represented. There are many examples of this phenomenon, including:
- ‘Brexitometer’ boards in Wales: a massive over-representation of support for remaining in the European Union. In July 2019, YouGov polling suggested public opinion in Wales was far more balanced.
- HOPE not Hate: the campaign group ran an open survey about perceptions of immigration. It also paid ICM Unlimited to run a representative opinion poll. Extreme responses (1 and 10 on the scale) were over-represented in the open survey.
- World of Warcraft: a 2014 study compared a self-selecting survey of French-speaking players against a (perfect) random sample of avatars:
The self-selected samples appear to be more involved in the game than the random sample avatars.
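The mechanism behind these lopsided results can be sketched with a small simulation. The numbers here are entirely made up: a population with mostly moderate views on a 1–10 scale, where the assumption is that people with extreme views are more likely to bother voting.

```python
import random

random.seed(42)

# Hypothetical population: opinions on a 1-10 scale, centred on moderate views.
population = [
    random.choices(range(1, 11),
                   weights=[2, 4, 8, 12, 14, 14, 12, 8, 4, 2])[0]
    for _ in range(100_000)
]

def takes_part(opinion):
    # Assumed self-selection rule: the further from the midpoint (5.5),
    # the more likely someone is to vote in the open survey.
    distance = abs(opinion - 5.5)          # 0.5 (moderate) .. 4.5 (extreme)
    return random.random() < 0.05 + 0.08 * distance

self_selected = [o for o in population if takes_part(o)]

def share_extreme(sample):
    # Proportion answering 1 or 10 (the ends of the scale).
    return sum(o in (1, 10) for o in sample) / len(sample)

print(f"Extreme answers in population:  {share_extreme(population):.1%}")
print(f"Extreme answers in open survey: {share_extreme(self_selected):.1%}")
```

Under these assumed participation rates, the ends of the scale make up about 5% of the population but more than double that share of the self-selecting sample, mirroring the pattern in the HOPE not Hate comparison.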
A large number of self-selecting respondents does not fix these problems. People share these surveys with followers and friends, who tend to hold similar views, so ‘sharing for a bigger sample’ can increase self-selection bias.
Matt Chorley started a ‘World Cup’ of “Best PMs we never had”. In the final self-selecting survey, the former leader Jeremy Corbyn MP (Labour, Islington North) won.
Alan McGuinness (Sky News) then wrote an article about this Twitter survey:
But the unscientific poll results show an exact opposite outcome to a recent survey by official polling company YouGov, which put Mr Corbyn bottom of the list — with a net score of -53%.
If it is an “unscientific” survey, then a news organisation should not report on its result. This is regardless of topic or result.
Kay Burley (Sky News) then brought up this Twitter survey live on air. The discussion was with shadow minister Toby Perkins MP (Labour, Chesterfield):
KB: This latest poll though that Jeremy Corbyn is the best Prime Minister we never had. Would you agree with that?
TP: It’s not a poll. I wouldn’t, I must admit. I voted for a couple of others ahead of him. That’s not a poll. It’s a Twitter…
KB: What is it then? Just because you don’t like the result doesn’t mean it’s not a poll.
TP: It wouldn’t have any legitimacy in opinion poll terms. People did have an opportunity to make Jeremy Corbyn a few months ago, and they chose against it.
KB: Are you dissing this Times Radio poll?
TP: That’s a bit of fun.
KB: I don’t think they think it’s a bit of fun: saying that he’s the best Prime Minister we’ve never had here in the United Kingdom.
TP: There was a large number of people who went onto Twitter. Twitter is not public opinion.
KB: It certainly is.
Self-selecting surveys have no controls, no weights, and no credibility. They are not reliable measures of public opinion. Journalists should not speak about these surveys as if they represent public opinion.
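By contrast, representative polls correct known imbalances with weighting. A minimal sketch of the idea, using made-up figures (a sample that over-represents under-35s relative to an assumed population split):

```python
# Assumed population shares (e.g. from census figures) versus a skewed sample.
population_share = {"under_35": 0.40, "35_plus": 0.60}

# Made-up sample of 100 responses: 70% under-35s, 30% aged 35+.
sample = (
    [("under_35", "yes")] * 49 + [("under_35", "no")] * 21 +
    [("35_plus", "yes")] * 12 + [("35_plus", "no")] * 18
)

# Each respondent is weighted by (population share / sample share) of their group.
sample_share = {g: sum(1 for grp, _ in sample if grp == g) / len(sample)
                for g in population_share}
weights = {g: population_share[g] / sample_share[g] for g in population_share}

raw_yes = sum(1 for _, a in sample if a == "yes") / len(sample)
weighted_yes = (sum(weights[g] for g, a in sample if a == "yes")
                / sum(weights[g] for g, _ in sample))

print(f"Unweighted 'yes': {raw_yes:.0%}")   # 61%
print(f"Weighted 'yes':   {weighted_yes:.0%}")   # 52%
```

Weighting can only correct for characteristics the pollster measures and knows the population values for. An open Twitter survey records neither, so no such correction is possible.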
Good Morning, confusion
The Good Morning Britain account started a Twitter survey about the use of face masks in schools:
YouGov ran an internet panel poll of 2,215 parents in England and Wales (aged 18 or over). The company collected responses on 27–28 August 2020. The question was:
Do you think that secondary school children should or should not have to wear face masks upon their return to school?
In the YouGov sample, 57% said “they should”, and 27% said “they should not”. The other 16% of parents did not know. This is a survey, so there is a range around each figure: it could be somewhat higher or lower.
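As a rough illustration of that range: for a simple random sample of this size, the 95% margin of error on the 57% figure works out at about two percentage points. (Internet panels carry additional design effects, so treat this as a lower bound.)

```python
from math import sqrt

# 95% margin of error for a proportion under simple random sampling:
# 1.96 * sqrt(p * (1 - p) / n)
p, n = 0.57, 2215          # 57% said "they should"; 2,215 parents sampled
margin = 1.96 * sqrt(p * (1 - p) / n)

print(f"57% +/- {margin:.1%}")   # roughly +/- 2 percentage points
```

So the underlying figure could plausibly sit anywhere from about 55% to 59%, which is the “somewhat higher or lower” range referred to above.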
This is another example of self-selection bias at work. Anyone with a Twitter account could take part, including people without children. Since those who vote are those most animated by the issue, self-selecting surveys produce lopsided and inaccurate results. The question wording was also different, as were the two populations (all Twitter users versus parents in England and Wales).
Good Morning Britain included the Twitter survey in their broadcast. That led to confusion when GMB referred to the YouGov poll.
Let the MRS be your guide
With IMPRESS, the Market Research Society wrote a guide for journalists.
It is crystal clear on the matter of self-selecting surveys [edited for a missing word]:
A questionnaire/voodoo poll/straw poll/online vote/text vote all try to obtain a view by contacting as many people as possible to answer questions. The sample will always be self-selecting, and the numbers will have no statistical significance.
Reporters may find this a valuable and useful editorial tool producing good anecdotal material or provides a different [view] but the limitations of this type of information gathering must be clear and the results should not be included in news reporting.
To reiterate: news reporting should not include results from self-selecting surveys.