
Suarez, S. M. (2016). The effects of missing data treatment on person ability estimates using IRT models.


A thesis presented to the Faculty of The Graduate College at the University of Nebraska in partial fulfillment of requirements for the degree of Master of Arts, Major: Educational Psychology, under the supervision of Professor Rafael De Ayala. Lincoln, Nebraska: August 2016

Copyright © 2016 Sonia Mariel Suarez Enciso


Unplanned missing responses are common in surveys and tests, including large-scale assessments. There has been an ongoing debate about how missing responses should be handled, and some approaches are preferred over others, especially in the context of item response theory (IRT) models. In that context, examinees' abilities are typically estimated with the missing responses either ignored or treated as incorrect. Most studies that have explored the performance of missing data handling approaches have used simulated data. This study uses the SERCE (UNESCO, 2006) dataset and its missingness pattern to evaluate the performance of three approaches: treating omitted responses as incorrect, midpoint imputation, and multiple imputation with and without auxiliary variables. Using the Rasch and 2PL models, the results showed that treating omitted responses as incorrect yielded a smaller average error in ability estimation but tended to underestimate examinees' abilities. Multiple imputation with and without auxiliary variables performed similarly; consequently, including auxiliary variables may not harm the estimation, but it can become an unnecessary burden during the imputation process. Midpoint imputation did not differ much from multiple imputation in performance and thus should be preferred over the latter for practical reasons. The main implication is that SERCE might have underestimated students' abilities. Limitations and future directions are discussed.
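The three treatments compared in the abstract can be sketched in a few lines. This is a hypothetical illustration, not the thesis's actual code: the response matrix, the function names, and the Bernoulli-draw stand-in for a full multiple-imputation model are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def treat_incorrect(resp):
    """Score every missing response (NaN) as incorrect (0)."""
    out = resp.copy()
    out[np.isnan(out)] = 0.0
    return out

def treat_midpoint(resp):
    """Impute the midpoint of the 0/1 scoring scale (0.5) for missing cells."""
    out = resp.copy()
    out[np.isnan(out)] = 0.5
    return out

def treat_multiple_imputation(resp, m=5):
    """Crude stand-in for multiple imputation: fill each missing cell with a
    Bernoulli draw using the item's observed proportion correct, m times,
    returning m completed datasets (a real MI model would condition on more)."""
    p_item = np.nanmean(resp, axis=0)  # observed proportion correct per item
    datasets = []
    for _ in range(m):
        out = resp.copy()
        miss_r, miss_c = np.where(np.isnan(out))
        out[miss_r, miss_c] = rng.binomial(1, p_item[miss_c]).astype(float)
        datasets.append(out)
    return datasets

# Toy data: 4 examinees x 3 items, two unplanned missing responses.
resp = np.array([[1.0, np.nan, 1.0],
                 [0.0, 1.0, np.nan],
                 [1.0, 1.0, 0.0],
                 [0.0, 0.0, 1.0]])

# Proportion correct per examinee under each treatment (a crude ability proxy);
# treating missing as incorrect gives the lowest scores, as the abstract notes.
print(treat_incorrect(resp).mean(axis=1))
print(treat_midpoint(resp).mean(axis=1))
```

In a full analysis each completed dataset would be calibrated under the Rasch or 2PL model and the resulting ability estimates pooled; the point here is only how the three treatments differ in what they write into the missing cells.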

Adviser: R. J. De Ayala