The Effects of Mismatches Between Survey Question Stems and Response Options on Data Quality and Responses
Published in Journal of Survey Statistics and Methodology (2018), 32pp.
Several questionnaire design texts emphasize the dual role of question wording: the wording must both express what is being measured and tell respondents how to answer. Researchers tend to focus heavily on the first of these goals but sometimes overlook the second, resulting in question wording that does not match the response options provided (i.e., mismatches). Common examples are yes/no questions with ordinal or nominal response options, open-ended questions with closed-ended response options, and check-all-that-apply questions with forced-choice response options. A slightly different type of mismatch involves a question stem that can be read as asking for two different types of answers, with no indication of which type should be provided. In this paper, we report the results of twenty-two experimental comparisons of data quality indicators (i.e., item nonresponse and response time) and response distributions across matched and mismatched versions of questions from a postal mail survey and a telephone survey. We find that mismatched items generally have lower data quality than matched items and that substantive results differ significantly across matched and mismatched designs, especially in the telephone survey. The results suggest that researchers should be wary of mismatches and should strive for holistic design.
Copyright © 2018 Jolene D. Smyth and Kristen Olson. Published by Oxford University Press on behalf of the American Association for Public Opinion Research. Used by permission.