Survey Says

What information does an article about polling need?

News organisations often publish articles about polling. Survey results can form an important part of major stories.

What information do articles based on polls need? This article seeks to answer that question, drawing on Market Research Society guidance.

Reporting on polls

There are eight key questions that a poll-based article should seek to answer.

1. Who did the poll?

Who has undertaken the survey research? The company name is important. Journalists and readers need to work out if this is a reputable company.

In the UK, journalists can check whether the company is a member of the British Polling Council and whether it follows the Market Research Society Code of Conduct.

Journalists should ensure news items include the polling company’s name. This is a basic courtesy to survey researchers. Yet, it is not always followed.

A Daily Mirror article in April 2018 stated:

More than half of the UK wants a “people’s vote” on Brexit, says new survey

This Daily Mirror article does not mention the polling company. In fact, it was an Opinium poll on behalf of Open Britain. Opinium interviewed 2,008 UK adults via its internet panel on 10–12 April 2018. Contrary to the headline, the question did not use the phrase “people’s vote”:

What UK Thinks EU is an excellent resource for tracking polls. (Image: What UK Thinks EU)

2. Who sponsored the poll?

Readers should know who commissioned the survey. News organisations often sponsor regular vote intention surveys. Examples of clients include:

  • The Times sponsoring a weekly YouGov poll;
  • The Observer sponsoring Opinium surveys.

Survey research is a dusty mirror on society. Articles should recognise who paid for this public good.

The research company may self-fund opinion polls. No-one commissioned YouGov to ask which ice creams are an “ice lolly”:

There was a near-even split on the categorisation of Feasts. (Image: YouGov)

Nor did anyone commission YouGov to ask Brits about chip toppings.

Social research is about more than vote intention polls or what newspapers might pay for. Finding answers to questions about ourselves can be fun too. If the research company covered the costs, that needs recognition.

The client should not influence the results. Transparency is important: readers should know whether the client has an interest in the results.

3. Who did they ask?

Opinion polls often ask around 1,000–2,000 adults in Great Britain or the United Kingdom. Adults means residents aged 18 or over. The decision whether to include Northern Ireland is important.

Bigger polls are not always better. What matters is how the company gathered their sample. Huge sample sizes may sound impressive, but often go with poor methods. Self-selecting surveys — such as on Twitter — should not be mistaken for reliable measures.

In general, larger surveys only reduce sampling error. (Image: Pew Research Center)
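
To illustrate that point, here is a minimal Python sketch of the standard margin-of-error calculation, assuming simple random sampling and a reported share of around 50%. Real polls use more complex designs and weighting, so treat these figures as rough approximations.

    import math

    def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
        """Approximate 95% margin of error for a proportion under simple random sampling."""
        return z * math.sqrt(p * (1 - p) / n)

    # The margin of error shrinks with the square root of the sample size:
    # quadrupling the sample only halves the sampling error.
    for n in (500, 1000, 2000, 10000):
        print(f"n = {n:>6}: +/- {margin_of_error(0.5, n):.1%}")

Adding respondents narrows that range, but it does nothing to fix a poorly gathered sample.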

Some polls have different target populations. These populations may be in certain regions, age groups, or occupations. A vote intention poll of adults in Scotland would be very different to one of adults in London or South West England.

Readers should know who the polling company asked in their survey.

4. How did the company conduct its research?

There are two main ways for British opinion polls to find respondents:

  • Telephone: computers generate random telephone numbers to dial (‘random digit dialling’). Researchers call people on landlines or mobile phones.
  • Internet panels: people join a panel to answer surveys. Researchers contact people on this panel.

Some companies run random probability samples with face-to-face interviews. ‘Internet river’ sampling — where ads intercept people online — is another method.

The survey mode matters: it can affect how people respond to questions. A person may not disclose embarrassing facts about themselves in a phone call. There were differences between phone and internet polls in the EU referendum.

Internet panel polls suggested it was a close contest. (Image: Prof Patrick Sturgis/NCRM)

Survey companies will say their sample is ‘representative of the population’. That means they ‘weighted’ their results to match the population profile.

Say you had a poll with 1,100 men and 900 women. In the UK adult population, there are more women than men. That poll would ‘weight’ the results, so it had the ‘right’ number of men and women. Answers from women would count for a little more, and men a little less.
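
As a rough sketch of that arithmetic, a simple weight is the population share divided by the sample share for each group. The population shares below (49% men, 51% women) are illustrative assumptions, not official figures, and real polls weight on several variables at once.

    # Hypothetical sample from above: 1,100 men and 900 women.
    sample = {"men": 1100, "women": 900}
    # Assumed population shares, for illustration only.
    population_share = {"men": 0.49, "women": 0.51}

    total = sum(sample.values())  # 2,000 respondents
    weights = {
        group: population_share[group] / (count / total)
        for group, count in sample.items()
    }

    for group, weight in weights.items():
        print(f"{group}: weight = {weight:.2f}")
    # men: weight = 0.89 (each answer counts a little less)
    # women: weight = 1.13 (each answer counts a little more)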

Readers should know how the company weighted its responses (such as by age or gender).

5. When did the company survey their respondents?

Major events can affect public opinion. There is a tendency to believe every event will turn the tides of public attitudes.

Partisan reasoning means people interpret news to support their current positions. People who watch only a limited amount of news may not even notice an event.

Populus asks an open question each week about which news story people noticed most. (Image: Populus)

There is direction from political leaders (“elite cues”), parties, and citizens (“social cues”). In Britain, there has been waning identification with political parties. As a result, public opinion is often more volatile — open to ‘electoral shocks’.

Readers should know when the research company conducted its survey.

6. What were the questions? What options could people choose?

Question wordings and response options affect survey estimates.

Minor changes in wording can produce notable differences.

‘Giving’ is more positive than ‘reducing’. (Image: YouGov)

Often, distinct wordings represent conceptual differences. Asking people if they want a referendum on accepting a UK-EU trade deal is different from asking about a referendum on EU membership.

There are different response formats too. Researchers could ask people to give an answer from a list. Such closed questions should aim to have an exhaustive and exclusive list of options. There are open questions too, where people can respond in their own words.

Asking questions in an agree-disagree format may lead to acquiescence bias. Some people are just agreeable. Some people want to get through the survey.

How can researchers construct good question wordings? (Video: Pew Research Center)

Question order can also influence how people respond. This is why vote intention questions are often first: to give the ‘cleanest’ estimate. Writing surveys is a challenging task.

7. What are the plausible ranges around survey estimates?

Survey statistics are estimates of actual public opinion. Instead of asking everyone, researchers ask a sample. That sample should be similar to the whole population. It is like a cake: researchers aim to take the perfect slice.

There is a cost to not asking everyone: the results may differ somewhat from actual public opinion. That uncertainty is inherent to survey research.

As such, we may see differences between surveys that are due to sampling variation alone. When those differences are small, we cannot be confident there was a real change in public opinion.
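
A small simulation can make that concrete. In this illustrative Python sketch, opinion in the population is fixed at 52%, yet repeated samples of 1,000 people still produce noticeably different estimates.

    import random

    random.seed(1)  # for reproducibility of this illustration
    true_share = 0.52  # assumed, fixed level of support in the population

    # Draw five independent samples of 1,000 people and report each estimate.
    for i in range(5):
        sample = [random.random() < true_share for _ in range(1000)]
        print(f"Poll {i + 1}: {sum(sample) / 1000:.1%}")

The spread between those ‘polls’ is sampling error alone; nothing in the underlying opinion has changed.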

We only see one sample. (GIF: Wild/University of Auckland)

A lot can go wrong with surveys. Even in ideal conditions, there is sampling error — the cost of not asking everyone. The total survey error framework categorises other types of error:

  1. Specification error (validity): the survey does not measure what the researcher intended.
  2. Coverage error (frame error): units that should be in the sampling frame are duplicated or missing.
  3. Non-response error: if the people who do not answer the survey differ from those who do, we get an error.
  4. Measurement error: the way researchers conducted the survey affected the recorded values. For example: people giving socially desirable answers to interviewers.
  5. Processing error: after collection, researchers may make mistakes, such as incorrect imputations, codes or weights.

Journalists should state the plausible ranges that surround survey estimates.

8. Where can I find the data tables?

Members of the British Polling Council produce data tables (or computer tables) for each published survey. The British Polling Council has a list of member companies, with links to their archives or sites.

These tables show the question order, question wording, and response options. The survey statistics are broken down by political and demographic groups. There may be a methods section included in the report too.

You can see the sample size (1,606) and target population (“adults in GB”) too. (Image: YouGov)

Once the data tables are available, journalists should link to these tables. Before then, journalists can link to the general archive for that company. It is helpful for interested readers to know what other questions there were.

Understanding public opinion is important for news reports. Journalists can expand on this list, such as by including general trends and other survey results.

This blog looks at the use of statistics in Britain and beyond. It is written by RSS Statistical Ambassador and Chartered Statistician @anthonybmasters.
