Survey researchers have long speculated that there may be a link between nonresponse and measurement error: people who are likely to become nonrespondents to a survey are also likely to be poor reporters if they do take part. Still, there is surprisingly little evidence of such a link. Nonresponse may generally be the product of one set of factors and reporting errors the product of an unrelated set, or both nonresponse and reporting errors may be item-specific, so that no general relationship between the two emerges. Our study examined a situation in which we expected a link between response propensities and the propensity to give inaccurate answers. We asked samples of voters and nonvoters to take part in a survey that included items about voting. Past research shows that nonvoters often misreport the fact that they did not vote and that they are less likely than voters to take part in surveys in general. We expected that characterizing the survey as being about politics would heighten the differences between voters and nonvoters in both response rates and levels of misreporting. However, only the nonresponse biases were larger when the topic of the survey was described as political, and this difference was only marginally significant. The two ways of framing the study had even smaller effects on estimates derived from other items in the questionnaire. The overall biases in estimates derived from the voting items are very substantial, and both nonresponse and measurement error contribute to them.
Citation:
Tourangeau, R., Groves, R. M., & Redline, C. D. (2010). Sensitive topics and reluctant respondents: Demonstrating a link between nonresponse bias and measurement error. Public Opinion Quarterly, 74(3), 413–432. https://doi.org/10.1093/poq/nfq004