Mode effects and knowledge questions
How does the survey mode affect people’s answers?
--
Back in 2011, the Royal Statistical Society commissioned Ipsos MORI to survey MPs. One question tested statistical knowledge of independent events:
If you spin a coin twice, what is the probability of getting two heads?
In face-to-face interviews of 97 MPs, around two in five (40%) gave the correct answer.
A decade later, Savanta ComRes asked a similar question through an online survey of 101 MPs:
If you toss a fair coin twice, what is the probability of getting two heads?
This time, about one in two MPs (52%) gave the correct answer: a likely improvement on the previous survey.
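For reference, the correct answer is one in four: each toss of a fair coin is independent, so the probability of two heads is 1/2 × 1/2 = 1/4. As a quick illustration (a sketch in Python, not part of either survey), a short simulation lands close to that value:

```python
import random

# Illustrative sketch: toss two fair coins many times and count
# how often both come up heads. Not part of either survey.
random.seed(42)
trials = 100_000
both_heads = 0
for _ in range(trials):
    first = random.choice(["H", "T"])
    second = random.choice(["H", "T"])
    if first == "H" and second == "H":
        both_heads += 1

print(f"Share of trials with two heads: {both_heads / trials:.3f}")
# Prints a value close to 0.25, matching 1/2 x 1/2 = 1/4
```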
There are three differences between the two surveys: a different company asked a slightly different question through a distinct mode. The mode is how researchers collect answers: face-to-face, by telephone, or online.
The pandemic halted many face-to-face surveys, forcing changes in mode. Ipsos MORI moved its survey of British parliamentarians to telephone interviews. Researchers running the Understanding Society panel suspended face-to-face interviews; that panel had used mixed-mode methods before, and moved to telephone and web surveys only.
What influence would changing the mode have on survey responses? We need to consider how the mode of delivery can affect answers. In survey research, these differences are known as ‘mode effects’.
An in-person interviewer may make people feel uncomfortable answering sensitive questions. ‘Social desirability bias’ is the tendency to give answers that others will view favourably. That effect tends to be strongest in face-to-face surveys and weakest in web surveys.
In surveys with visual scales, respondents may be more likely to select middle options. Internet surveys can suffer from higher levels of satisficing. This is where people complete the survey with minimal…