ABSTRACT: Background
Reflective writing is used throughout medical education to help students navigate their transformation into medical professionals. Assessing reflective writing, however, is challenging; each available assessment methodology has distinct advantages and disadvantages. We tested whether two independent assessment mechanisms, a faculty-designed rubric and Academic Writing Analytics (AWA), an automated technique, could be combined to form a more robust evaluation.
Methods
We obtained reflective essays written by first-year medical students as part of a clinical skills course. Faculty scored the essays using a rubric designed to evaluate Integration, Depth, and Writing. The same essays were subjected to AWA analysis, which counted the number of reflective phrases indicative of Context, Challenge, or Change.
Results
Faculty scored the essays uniformly high, indicating that most students met the standard for reflection described by the rubric. AWA identified over 1400 instances of reflective behavior within the essays, and there was significant variability in how often individual students used different types of reflective phrases.
Conclusions
While data from either faculty assessment or AWA alone are sufficient to evaluate reflective essays, combining these methods offers a richer and more valuable understanding of student reflection.
SUBMITTER: Hanlon CD
PROVIDER: S-EPMC8368857 | biostudies-literature
REPOSITORIES: biostudies-literature