Self-Selection and Social Media Surveys
Social media platforms, such as Facebook and Twitter, may offer users the ability to run surveys.
Open-access surveys are not proper measures of public opinion.
In short
Self-selection bias: When respondents decide entirely for themselves whether to participate in a survey, the resulting sample may not be representative.
Open-access surveys have no controls, no weightings and no credibility.
Question wording: Those writing these surveys may not design balanced questions, leading to further bias in the survey results.
Voodoo Hoodoo
An open-access survey is a type of survey where the respondents self-select into participation. Common types of open-access surveys include Facebook surveys, Twitter polls, clickable questions on websites, as well as mail-in, phone-in and text-in surveys.
The term ‘voodoo poll’ was coined by Sir Robert Worcester, founder of the polling company MORI. These surveys are vulnerable to manipulation: people can phone or text in multiple times, and bots can be used to answer online surveys.
People may share these open-access surveys with their friends and followers. Since shares travel through like-minded networks, the refrain to share ‘for a larger sample size’ can lead to greater bias, not less.
These samples are composed solely of those who see the open-access survey and are animated enough by the issue to participate.
The British Future and HOPE not Hate report, ‘National Conversation on Immigration’, contained an excellent demonstration.
People were asked to rate, on a scale of 1 to 10, the impacts of immigration on the UK, including on their local community. The question was posed both in an online open survey and to a representative sample polled by ICM Unlimited.
The extremely negative (1) and extremely positive (10) responses were over-represented in the online open-access survey, with intermediate answers under-represented.
Despite its larger size, the open sample gave an inverted impression of national opinion.
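A small simulation can illustrate the mechanism. The sketch below is illustrative only, with invented parameters: it assumes a population whose ratings cluster around the middle of the scale, and that the chance of answering an open poll rises with the strength of opinion. The self-selected responses then over-represent the extremes and hollow out the middle.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical population: ratings on a 1-to-10 scale, clustered around the middle.
population = np.clip(np.round(rng.normal(loc=5.5, scale=1.8, size=100_000)), 1, 10)

# Assumption: willingness to answer an open poll rises with opinion intensity,
# here measured as distance from the midpoint of the scale.
intensity = np.abs(population - 5.5)             # 0.5 (moderate) to 4.5 (extreme)
response_prob = 0.02 + 0.18 * (intensity / 4.5)  # from ~4% (moderate) to 20% (extreme)

responded = rng.random(population.size) < response_prob
open_poll = population[responded]

def share_of(ratings, values):
    """Proportion of ratings falling in the given set of values."""
    return np.isin(ratings, values).mean()

print(f"Extreme answers (1 or 10), population: {share_of(population, [1, 10]):.1%}")
print(f"Extreme answers (1 or 10), open poll:  {share_of(open_poll, [1, 10]):.1%}")
print(f"Middling answers (5 or 6), population: {share_of(population, [5, 6]):.1%}")
print(f"Middling answers (5 or 6), open poll:  {share_of(open_poll, [5, 6]):.1%}")
```

No number of extra respondents fixes this: the selection mechanism, not the sample size, drives the distortion.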
Scientific polling methods randomly select people from the population or some proxy group (the ‘sampling frame’). A typical sampling frame used in modern polling is an internet panel. Statistical weighting is then used to make the sample reflect the target population.
By contrast, open-access surveys have no controls, no weightings and no credibility.
No inferences can be made about the general population from these open surveys.
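For illustration, here is a minimal sketch of the weighting step described above, using cell weighting (post-stratification) on a single variable with invented sample and population figures. Real polls weight on several variables at once, but the principle is the same: each respondent counts in proportion to how common people like them are in the target population.

```python
import pandas as pd

# Invented example: an unrepresentative sample that over-represents older respondents.
# Support for a policy: 60% among 18-34s, 50% among 35-54s, 30% among 55+.
sample = pd.DataFrame({
    "age_group": ["18-34"] * 200 + ["35-54"] * 350 + ["55+"] * 450,
    "supports_policy": [1] * 120 + [0] * 80 + [1] * 175 + [0] * 175 + [1] * 135 + [0] * 315,
})

# Assumed shares of the target population (e.g. from census data).
population_shares = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Cell weight = population share / sample share for each age group.
sample_shares = sample["age_group"].value_counts(normalize=True)
sample["weight"] = sample["age_group"].map(lambda g: population_shares[g] / sample_shares[g])

unweighted = sample["supports_policy"].mean()
weighted = (sample["supports_policy"] * sample["weight"]).sum() / sample["weight"].sum()

print(f"Unweighted support: {unweighted:.1%}")  # 43.0%: skewed towards older, less supportive respondents
print(f"Weighted support:   {weighted:.1%}")    # 46.0%: rebalanced to the assumed age profile
```

Crucially, weighting can only rebalance a sample drawn under a known design; it cannot rescue an open-access survey whose participants selected themselves for unknown reasons.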
Balanced Wording
Beyond self-selection bias, there is another reason why these surveys may fail to represent public opinion.
The authors may not design balanced questions, leading the self-selecting participants towards particular answers. I present two recent examples from Facebook.
‘Moggmentum’ posed an imbalanced question: people were offered the option to support the proposal, but not to oppose it, in the question wording. Compounding the error, the author depicted the possible responses with smiling and sad photos of Jacob Rees-Mogg MP (Conservative, North East Somerset).
On Twitter, Charlie Elphicke MP (Conservative, Dover) shared a “10,000 sample open poll” from Facebook, claiming an “astonishing 97%” supported a ‘free trade deal’ over the ‘Chequers proposals’.
To borrow a friend’s quip, this is a representative’s sample, not a representative sample.
It should be unsurprising, not astonishing, that a Leave-supporting Conservative MP’s Facebook survey — with an imbalanced question — heavily leans towards one response.
It may be fun to ask people on Twitter what you should eat for dinner, or how to style your hair.
It is less fun when people cite open-access surveys as though they measured public opinion, misrepresenting what the public actually thinks.