Objective: To develop a new workplace-based electromyography (EMG) direct observation tool (EMG-DOT) and gather validity evidence supporting its use for assessing electrodiagnostic skills among postgraduate medical trainees.
Methods: The EMG-DOT was developed by experts through an iterative process. Validity evidence was collected from five sources (content, response process, internal structure, relations to other variables, and consequences of testing) during the 2013-2014 academic year.
Results: Of 3,412 studies performed by trainees during the study period, 299 (9%) were assessed using the EMG-DOT; of these, 203 (68%) were rated by a physician and 96 (32%) by a technician. The 14-item EMG-DOT had excellent internal-consistency reliability (Cronbach α = 0.94). Correlations between individual items and criterion-referenced global ratings of performance ranged from 0.36 to 0.72 (all p < 0.001). Mean total scores increased from 70% to 80% over 4 months of the EMG rotation (p < 0.001) despite a concurrent, significant increase in case complexity (from 0.21 to 0.74 on a 3-point rating scale; p < 0.001). Trainees reported that the observational assessment exercise improved their knowledge or skills in 82% of encounters (188/230) and that feedback generated by the EMG-DOT improved the quality of care provided to patients in 58% (133/230). Trainees were "satisfied" or "very satisfied" with the observational assessment exercise in 96% of encounters (234/243).
Conclusions: This study provides validity evidence supporting the use of EMG-DOT scores to assess the electrodiagnostic skills of residents and fellows. The EMG-DOT can be used to inform milestone-based assessments of trainee performance in neurology, child neurology, physical medicine and rehabilitation, neuromuscular medicine, and clinical neurophysiology training programs.