

Rethinking how medical students are evaluated for residency

BY SARAH RICHARDS


“I am a Black student taking the USMLE Step 1 exam on Thursday, June 11. I have not had time to grieve, I have not had time to feel, and I have not had time to hurt—because in order for me to pass this exam I am required to be hyper-focused and undistracted. A luxury that I do not have.”

Jasmine Solala, MA, MS3, University of Illinois College of Medicine

To develop the most skilled, innovative, adaptable and compassionate physicians, leaders in academic medicine continuously review and adjust the methods used to teach, train and evaluate medical students and residents.

The civil rights protests last summer, along with the stark health disparities exposed by the COVID-19 pandemic, added urgency to this practice, with many students and faculty stressing the need for medical education to better address structural racism.

Several University of Chicago Pritzker School of Medicine faculty members have emerged as thought leaders in this environment. Their suggestions for improving medical student evaluation have appeared in leading journals, and their actions are placing the University at the forefront of modernizing medical education.

“The University of Chicago Medicine is a well-known medical training institution, and we can be an example for others to follow,” said Alisa McQueen, MD, Associate Chair for Education and Program Director of the pediatric residency program and the fellowship in pediatric emergency medicine program.

Embracing the move to pass/fail for Step 1

In a letter titled “Why We Can’t Wait,” published in the Journal of Graduate Medical Education in February, 11 UChicago Medicine residency program directors announced they would no longer use U.S. Medical Licensing Examination (USMLE) Step 1 scores when selecting medical students for their residencies. Beginning with the 2020 recruitment season, Step 1 numerical scores will be masked when programs select applicants for interviews or rank applicants for the match.

The Federation of State Medical Boards (FSMB) and the National Board of Medical Examiners (NBME) decided last year to replace numerical scores with a pass/fail ranking to reduce the overemphasis on students’ performance on the test. The change doesn’t go into effect until 2022, however—making UChicago Medicine one of the first institutions in the U.S. to adopt it.

“There is no evidence that being able to score well on a standardized exam makes you a better physician, researcher or educator, and for many equity-seeking groups, standardized exams and particularly board exams have been an obstacle for reaching certain specialized fields in medicine,” said Monica Vela, MD’93, Associate Dean for Multicultural Affairs.

The authors of “Why We Can’t Wait” include Jasmine Solala, MS3, a University of Illinois College of Medicine student who wrote an essay about her experiences studying for Step 1 during the George Floyd protests; Victoria Okuneye, PhD’20, MS2, who brought the essay to the attention of Pritzker leadership; Rochelle Naylor, MD, Associate Program Director of the pediatric residency program; McQueen; and Vela.

The letter urges other institutions to immediately adopt Step 1 pass/fail grading in light of the structural inequities emphasized by the pandemic and Black Lives Matter protests. The authors contend that embracing this change early recognizes historic discrimination and injustices in medicine suffered by Blacks, Latinos, women and people of non-conforming genders and sexualities. At least some of the benefits of increasing diversity among physicians will be improved patient health outcomes and more equitable practices in medicine and medical education.

“The debate about changing the scoring system for the Step 1 exam is both a manifestation of dysfunction in the medical education pathway and an opportunity to address this dysfunction.”

From “A Shared Evaluation Platform for Medical Training,” New England Journal of Medicine

Step 1 was originally designed to be pass/fail. But as medical schools switched to pass/fail grading, residency programs began using Step 1 as a tool to screen applicants. Medical students increasingly skip class and other learning opportunities to devote hundreds of hours to studying for the high-stakes exam.

“Step 1 scores were never meant to be used to count somebody out before knowing anything else about their skills, abilities and contributions to the medical field,” Naylor said.

Pass/fail for Step 1 isn’t the only recent shake-up in the medical licensure process: In January, the FSMB and NBME announced that the Step 2 Clinical Skills exam would be dropped permanently after the in-person exams were shelved last year due to the pandemic. The two sponsors said they would work to establish new methods for assessing students’ clinical skills.

At UChicago Medicine, the residency programs that pledged not to consider Step 1 numerical scores will review students’ applications more holistically. In addition to the usual letters of recommendation, personal essay and medical school transcripts, applications are also being weighted for their service—for example, working in a community health clinic—leadership capacity, and work advancing the field of medicine.

Measuring performance in medical school and beyond

James Woodruff, MD, Pritzker’s Dean for Students and Associate Program Director for the internal medicine residency program, said the recent changes highlight a larger issue: the need to improve the transition between medical school and graduate medical education.

In an article published in February in The New England Journal of Medicine, Woodruff and John McConville, MD, Director of the internal medicine residency program, wrote that a shared framework is needed to measure students’ performance throughout medical school, residency and fellowships, as is a national data system to track the information.

The article describes some weaknesses of the current system. For example, residency applications focus overwhelmingly on a student’s positive qualities and miss relevant information that would support their move to graduate medical education, including comprehensive details about their performance during patient care experiences.

“Historically we’ve used reductionist assessment methods characteristic of the natural sciences,” said Woodruff. “The problem is we’re functioning as educators and physicians in areas with a wide range of complexity, from basic science all the way through sociology to philosophy.”

Even though grades from students’ preclinical classes and clinical rotations are included in residency applications, they do not predict a student’s performance during residency, Woodruff said. Doctoring requires judgment and continuous adaptability, along with the abilities to prioritize competing tasks, integrate into large teams, respond to feedback and communicate with others. Methods currently used by medical schools to assess these skills still don’t fully value or capture their complex nuances.

To address these flaws, Woodruff and McConville suggest requiring that all students complete at least one sub-internship and expanding the Accreditation Council for Graduate Medical Education’s Milestones resident assessment program so that universal specialty- and subspecialty-specific Milestones-based assessments can be used to evaluate students during clinical rotations. The information could then be entered into a national system of medical school, residency and fellowship performance data, providing educators with a longitudinal view of an individual’s performance.

Woodruff said expanding Milestones would also create a shared medical education assessment language and that the data could be used by medical schools and residency programs to see what is and isn’t effective in teaching programs.

“We’ll be able to make informed decisions about what works instead of relying on surrogate, short-term markers of dubious value when it comes to long-term outcomes,” said Woodruff, who added that the ACGME has expressed interest in a pilot program involving Milestones assessments and a handful of medical schools. “If such a plan comes to be, we’ll be one step closer to truly evidence- and outcomes-based training in medical school.”

“We urge our fellow program directors to join us in this effort. Our patients and our nation’s health care workforce cannot afford to wait. Justice cannot wait.”

From “Why We Can’t Wait,” Journal of Graduate Medical Education
