Department of Sociology


Document Type

Article

Date of this Version

2018

Citation

Published in Journal of Survey Statistics and Methodology (2018), 32 pp.

doi: 10.1093/jssam/smy005

Comments

Copyright © 2018 Jolene D. Smyth and Kristen Olson. Published by Oxford University Press on behalf of the American Association for Public Opinion Research. Used by permission.

Abstract

Several questionnaire design texts emphasize a dual role of question wording: the wording needs to express what is being measured and tell respondents how to answer. Researchers tend to focus heavily on the first of these goals but sometimes overlook the second, resulting in question wording that does not match the response options provided (i.e., mismatches). Common examples are yes/no questions with ordinal or nominal response options, open-ended questions with closed-ended response options, and check-all-that-apply questions with forced-choice response options. A slightly different type of mismatch uses a question stem that can be read as asking for two different types of answers with no indication of which type should be provided. In this paper, we report the results of twenty-two experimental comparisons of data quality indicators (i.e., item nonresponse and response time) and response distributions across matched and mismatched versions of questions from a postal mail survey and a telephone survey. We find that mismatched items generally have lower data quality than matched items and that substantive results differ significantly across matched and mismatched designs, especially in the telephone survey. The results suggest that researchers should be wary of mismatches and should strive for holistic design.

Supplemental data included; .docx version attached below.
