Making Sense of the MSPE: Understanding Neurology Program Directors’ Views on Clerkship Summaries
Stephen Powell1, Robert Thompson-Stone2, Tamara Kaplan1, Catherine Nicholas2, Samantha Myers2, Melanie Braun2, Andres Fernandez3, Christopher Mooney2
1Mass General Brigham, 2University of Rochester Medical Center, 3Thomas Jefferson University Hospital
Objective:
Explore how neurology program directors (PDs) interpret and make meaning of Medical Student Performance Evaluation (MSPE) clerkship summaries when evaluating residency applicants.
Background:
With reductions in traditional testing and grading metrics, residency program directors increasingly rely on clerkship summaries to assess applicants. Prior research highlights variability, limited transparency, and potential bias in these summaries. Little is known about neurology program directors’ perspectives on their usefulness or how departments might improve their value for residency selection.
Design/Methods:
We performed a qualitative study informed by a constructivist paradigm and theories of narrative assessment and evaluation discourse, using iterative cycles of semi-structured interviews. Four sample MSPE neurology clerkship summaries were adapted from de-identified real MSPEs. Following a pilot, interviews were conducted with 13 PDs across diverse program sizes, regions, and types (adult and child). Participants reflected on attributes of each summary, including specificity, transparency, redundancy, formatting, comparative descriptors, and keyword usage. Interviews were audio-recorded, transcribed, and analyzed using an inductive, team-based reflexive thematic analysis. Rigor was maintained through memoing, triangulation, and consensus discussions to ensure trustworthiness and credibility.
Results:
Six themes were generated: (1) Greater honesty and transparency are needed but carry risk for the student; (2) Variability across institutions limits fair and reliable judgments; (3) Summaries are most useful when clearly structured, streamlined, and transparent about how evaluations are synthesized; (4) Comparative information is critical for decision-making but prone to problematic elements; (5) Coded language is hard to interpret, creates confusion, and should be avoided; (6) Excessive laudatory comments with minimal specificity reduce usefulness and meaning.
Conclusions:
The usefulness of MSPE clerkship summaries is limited by lack of transparency, variability and suboptimal structure, and minimal specificity with inflated and coded language. These themes support the creation of specific recommendations for neurology clerkship directors, departments, and national stakeholders to guide improvement.
Disclaimer: Abstracts were not reviewed by Neurology® and do not reflect the views of Neurology® editors or staff.