ABSTRACT: Background
In surgical training, assessment tools supported by strong validity evidence allow for standardized evaluation despite changing external circumstances. At a large academic institution, surgical interns undergo a multimodal curriculum for central line placement that uses a 31-item binary assessment at the start of each academic year. This study evaluated this practice in the context of increased in-person learning restrictions. We hypothesized that external constraints would affect neither resident performance nor assessment, owing to a robust curriculum and assessment checklist.
Methods
From 2018 to 2020, 81 residents completed central line training and assessment. In 2020, the curriculum was modified to conform to in-person restrictions and social distancing guidelines. Resident score reports were analyzed using multivariate analyses to compare performance, objective scoring parameters, and subjective assessments between the "precoronavirus disease" years (2018 and 2019) and 2020.
Results
There were no significant differences in average scores or objective pass rates over the 3 years. Significant differences between 2020 and precoronavirus disease years occurred in subjective pass rates and in first-time success for 4 checklist items: patient positioning, draping, sterile ultrasound probe cover placement, and needle positioning before venipuncture.
Conclusion
Modifications to procedural training within current restrictions did not adversely affect residents' overall performance. However, our data suggest that in 2020, expert trainers may not have ensured learner acquisition of automated procedural steps. Additionally, although raters in 2020 may have been influenced by logistical barriers, leading to more lenient grading, the assessment tool preserved training and assessment integrity.
SUBMITTER: Schmiederer IS
PROVIDER: S-EPMC8276111 | biostudies-literature
REPOSITORIES: biostudies-literature