Sociology, Department of

 

Date of this Version

2010

Comments

Published in Field Methods 22:4 (2010), pp. 295–318; doi: 10.1177/1525822X10379795. Copyright © 2010 Kristen Olson; published by Sage Publications. Used by permission. http://fmx.sagepub.com/content/22/4/295

Abstract

Expert reviews are frequently used as a questionnaire evaluation method but have received little empirical attention. Questions from two surveys are evaluated by six expert reviewers using a standardized evaluation form. Each of the questions has validation data available from records. Large inconsistencies in ratings across the six experts are found. Despite the lack of reliability, the average expert ratings successfully identify questions that had higher item nonresponse rates and higher levels of inaccurate reporting. This article provides empirical evidence that experts are able to discern questions that manifest data quality problems, even if individual experts vary in what they rate as being problematic. Compared to a publicly available computerized question evaluation tool, ratings by the human experts positively predict questions with data quality problems, whereas the computerized tool varies in success in identifying these questions. These results indicate that expert reviews have value in identifying question problems that result in lower survey data quality.

Included in

Sociology Commons
