
The effects of item parameter drift in a short measure of a time-varying predictor

HyeSun Lee, University of Nebraska - Lincoln

Abstract

The purpose of the current study was to examine the impact of Item Parameter Drift (IPD) occurring in short context information scales from a large-scale assessment and to determine the most appropriate way to address IPD. Focusing on a context in which latent scores from a short scale were employed as a time-varying predictor, the current research investigated the impact of IPD on the estimation of scale scores and on parameter estimates in a multilevel model. Five manipulated factors, including three decisions about IPD, were considered. The three decisions were keeping any item exhibiting IPD, removing any item exhibiting IPD only for linking, and removing any item exhibiting IPD for the entire estimation process. Additionally, an empirical study was conducted to demonstrate how the classification of students and the parameter estimates of a time-varying predictor differed depending on the three decisions. The empirical study also illustrated how to incorporate sampling weights and plausible values into multilevel analyses. Results from the simulation study revealed that IPD occurring in a short scale affected the estimation of scores and increased misclassification rates. Substantial bias was found in parameter estimates in multilevel models, and the bias in time-varying predictors was more prominent at level 2 than at level 1. Regarding the most appropriate decision about IPD, keeping items exhibiting IPD was more appropriate than removing them for the entire estimation process. The empirical study showed findings similar to those of the simulation study. Removing an item exhibiting IPD for the entire estimation process resulted in markedly different proportions of students assigned to the lowest and highest categories compared with those produced when the item was kept. When the item was removed for the entire estimation process, the parameter estimates, especially for the level-2 time-varying predictors, differed from those obtained under the decision of keeping the item. Considering that studies of IPD have focused on its impact on the estimation of scores in relatively long educational tests, information about the impact of IPD occurring in a short scale and about the appropriate decision for addressing items exhibiting IPD can be useful for substantive researchers in education.
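The abstract notes that the empirical study incorporated sampling weights and plausible values into the multilevel analyses. The dissertation's exact procedure is not reproduced here; the lines below are only a minimal Python sketch of the standard pooling step (Rubin's combining rules) applied after the same model has been fit once per plausible value, with the function name and all numbers illustrative rather than taken from the study.

import numpy as np

def pool_plausible_values(estimates, std_errors):
    # Pool point estimates and standard errors from M model fits,
    # one per plausible value, using Rubin's combining rules.
    estimates = np.asarray(estimates, dtype=float)
    std_errors = np.asarray(std_errors, dtype=float)
    m = len(estimates)
    pooled = estimates.mean()               # average of the M estimates
    within = (std_errors ** 2).mean()       # mean within-imputation sampling variance
    between = estimates.var(ddof=1)         # variance of estimates across plausible values
    total = within + (1 + 1 / m) * between  # Rubin's total variance
    return pooled, np.sqrt(total)

# Hypothetical example: five plausible values, each analyzed in a separate
# (sampling-weighted) multilevel run yielding an estimate and a standard error.
est, se = pool_plausible_values(
    estimates=[0.42, 0.45, 0.40, 0.44, 0.43],
    std_errors=[0.06, 0.05, 0.06, 0.06, 0.05],
)
print(f"pooled estimate = {est:.3f}, pooled SE = {se:.3f}")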

Subject Area

Educational tests &amp; measurements; Quantitative psychology

Recommended Citation

Lee, HyeSun, "The effects of item parameter drift in a short measure of a time-varying predictor" (2016). ETD collection for University of Nebraska-Lincoln. AAI10134147.
https://digitalcommons.unl.edu/dissertations/AAI10134147
