Libraries at University of Nebraska-Lincoln


Date of this Version

Spring 8-2012

Comments

This article was submitted while the author was on a Fulbright Fellowship as a Scholar in Residence at the International Centre for Information Technology & Development, Southern University System, Baton Rouge, Louisiana, USA.

Abstract

Most computer-based assessments (CBAs) employ test generators that produce multiple-choice questions, usually with four options. A limitation of this type of evaluation is that students can randomly select or guess answers, with a 25% chance of choosing the right answer for each question. The implication is that students can earn, on average, one mark in four on such examinations without understanding the content taught in class or studying for the examination, simply by guessing. In light of this, the effectiveness of multiple-choice and objective questions as a tool for evaluating students' mastery of a subject can be questioned. Unfortunately, most test generators cannot handle essay-based questions, because there are no rigid responses to essay examination questions. We attempted to bridge this gap by developing EssayTest, a tool that generates essay-based questions and marks essay-based examinations. Built with Java, JDBC, MySQL and other third-party interface design tools, EssayTest employs similarity thresholds to match tokens in the answers supplied by teachers against the responses from students in an essay-based CBA as a way of scoring the examination. Preliminary tests showed very promising results.
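For illustration, the following is a minimal Java sketch of the kind of threshold-based token matching the abstract describes. The abstract does not specify the similarity measure, the threshold value, or how marks are awarded; Jaccard similarity over lower-cased word tokens, a 0.5 threshold, proportional marking, and all class and method names here are assumptions made for this sketch, not EssayTest's actual implementation.

    import java.util.HashSet;
    import java.util.Set;

    // Hypothetical sketch: score an essay response against a model
    // answer by comparing their token sets against a threshold.
    public class TokenSimilarityScorer {

        // Tokenize: lower-case the text and split on non-letter runs.
        static Set<String> tokens(String text) {
            Set<String> set = new HashSet<>();
            for (String t : text.toLowerCase().split("[^a-z]+")) {
                if (!t.isEmpty()) set.add(t);
            }
            return set;
        }

        // Jaccard similarity: |intersection| / |union| of token sets.
        // (Assumed measure; the abstract only says "similarity thresholds".)
        static double similarity(String modelAnswer, String response) {
            Set<String> a = tokens(modelAnswer);
            Set<String> b = tokens(response);
            if (a.isEmpty() && b.isEmpty()) return 1.0;
            Set<String> union = new HashSet<>(a);
            union.addAll(b);
            a.retainAll(b); // a now holds the intersection
            return (double) a.size() / union.size();
        }

        // Award marks in proportion to similarity once it clears the
        // threshold; below the threshold the response scores zero.
        static double score(String modelAnswer, String response,
                            double threshold, double maxMarks) {
            double sim = similarity(modelAnswer, response);
            return sim >= threshold ? sim * maxMarks : 0.0;
        }

        public static void main(String[] args) {
            String model = "A stack is a last in first out data structure.";
            String answer = "A stack is a data structure that is last in first out.";
            System.out.printf("similarity = %.2f, score = %.2f%n",
                    similarity(model, answer),
                    score(model, answer, 0.5, 5.0));
        }
    }

Running this prints a similarity of 0.90 and a score of 4.50 out of 5. The all-or-nothing threshold before proportional marking is one plausible design: it prevents superficial keyword overlap from earning partial credit while still rewarding closer matches above the cutoff.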
