ABSTRACT:
Background: This study investigated the impact of addressing item-writing flaws, testing at a low cognitive level, and non-functioning distractors (selection frequency < 5 %) in multiple-choice assessment in preclinical medical education.
Method: Multiple-choice questions with too high or too low difficulty (difficulty index < 0.4 or > 0.8) and insufficient discriminatory ability (point-biserial correlation < 0.2) on a previous administration were identified. Items in Experimental Subgroup A (21 multiple-choice questions) underwent removal of item-writing flaws along with enhancement of the tested cognitive level, while items in Experimental Subgroup B (11 multiple-choice questions) underwent replacement or removal of non-functioning distractors. A control group of items (Group C, 23 multiple-choice questions) did not undergo any intervention.
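For reference, the screening statistics named above can be computed from a 0/1 item-response matrix. The following Python sketch is illustrative only and is not taken from the study; the function and variable names are hypothetical, while the thresholds (difficulty index < 0.4 or > 0.8, point-biserial correlation < 0.2) are those stated in the abstract.

import numpy as np

def item_statistics(responses):
    """responses: examinees x items array of 0/1 scores (hypothetical input)."""
    difficulty = responses.mean(axis=0)          # difficulty index = proportion answering correctly
    stats = []
    for j in range(responses.shape[1]):
        item = responses[:, j]
        rest = responses.sum(axis=1) - item      # total score excluding the item itself
        r_pb = np.corrcoef(item, rest)[0, 1]     # point-biserial (item-rest) correlation
        stats.append((difficulty[j], r_pb))
    return stats

def flag_for_review(stats):
    """Flag items using the abstract's criteria."""
    return [j for j, (p, r) in enumerate(stats)
            if p < 0.4 or p > 0.8 or r < 0.2]

Because the item score is dichotomous, the Pearson correlation between the item score and the rest-score is the point-biserial coefficient, so no separate formula is needed.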
Result: Post-intervention, the average number of functioning distractors (selection frequency ≥ 5 %) per multiple-choice question increased from 0.67 to 0.81 in Subgroup A and from 0.91 to 1.09 in Subgroup B; a statistically significant increase in the number of multiple-choice questions with a sufficient point-biserial correlation was also noted. No significant changes were noted in the psychometric characteristics of the control group of items.
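A functioning distractor, as defined above, is an incorrect option chosen by at least 5 % of examinees. A minimal sketch of that count, again hypothetical rather than from the study (the option labels and the key argument are illustrative):

from collections import Counter

def functioning_distractors(chosen_options, key, threshold=0.05):
    """chosen_options: option labels selected for one item; key: the correct option."""
    counts = Counter(chosen_options)
    n = len(chosen_options)
    return sum(1 for option, c in counts.items()
               if option != key and c / n >= threshold)

# Example: 100 examinees, correct answer 'B'; options A (20 %) and C (6 %) function, D (4 %) does not
# functioning_distractors(['B'] * 70 + ['A'] * 20 + ['C'] * 6 + ['D'] * 4, key='B')  # -> 2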
Conclusion: Correction of item-writing flaws, removal or replacement of non-functioning distractors, and enhancement of the tested cognitive level positively affect the discriminatory ability of multiple-choice questions. This helps prevent construct-irrelevant variance from undermining the evidence for the validity of scores obtained on multiple-choice questions.
SUBMITTER: Ali SH
PROVIDER: S-EPMC4602009 | biostudies-literature
REPOSITORIES: biostudies-literature