Three Points Regarding the Brouhaha about Medical Schools Withdrawing from U.S. News and World Report Rankings
At last count, half of the medical schools ranked in the top 20 of the U.S. News and World Report rankings have announced that they are pulling out of the ranking, meaning that they will no longer share data with U.S. News to help it compile its rankings. What follows are three random musings on the goings-on.
Kudos to Harvard
Harvard Medical School, referred to as HMS only by those associated with it (and as BMS by Samuel Shem), deserves serious accolades. Formerly, and probably forever, ranked #1, Harvard was the first school to announce that it was no longer cooperating with the rankings. Brilliantly, it did this on the final day that schools were allowed to submit their data. Thus, Harvard not only made a point with its withdrawal but also left every other school, all of which had already submitted their data, with only two options: remain a party to the rankings or promise not to submit data in the future.
The rankings are idiotic
I probably have nothing new to say about the absurdity of these rankings; so much has already been written. My favorite takedown is Malcolm Gladwell's, on his Revisionist History podcast. (The linked page contains references to some great further reading.)
We all love lists because they simplify complexity, but they are fool’s gold. Every student requires something different from a college or medical school. Some schools excel in some areas while others excel in, well, others. Students need to think about their own goals, their own way of learning, and their strengths and weaknesses, and choose the school that will serve them best. Over the course of my career I have met educators from unranked medical schools whom I would have loved to work with, and learn from, at my (once top 10, then top 20, now withdrawing) medical school.
These rankings have not been just silly; they have actually been harmful. Prospective students have used them to choose medical schools even though the rankings have, at most, a tenuous relationship with the quality of the education delivered. At an institutional level, the rankings have incentivized the wrong things: schools have focused on making themselves appear more selective, or on burnishing their reputations, in order to climb in the rankings, things that do not actually help their own students.
There is probably no better argument for the idiocy of the rankings than their own methodology, as reported on the U.S. News website.
1. Peer assessment (15% weight). This is reputation as judged by deans, deans of academic affairs, and heads of internal medicine departments or directors of admissions. It is unimportant as far as the education a student receives is concerned. It is very important in terms of keeping top-ranked schools ranked at the top.
2. Assessment by residency directors (15%). Probably the only component I think holds much value. It is meant to reflect the “quality of the students” and their attractiveness to residencies, though it may speak more to the quality of the students admitted than to the quality of the training they received.
3. Federal research activity (30%) and research activity per faculty member (10%). This kills me. These numbers have almost nothing to do with the training of a doctor. At most, they have a small impact on the small amount of time during which some students engage in research.
4. Median MCAT score (13%). The academic achievement of your fellow students is important, but MCAT scores are not the way to measure this and certainly have no bearing on the overall quality of the education received at a medical school.
5. Faculty resources (10%). This would be important if the resources translated into time faculty spend teaching students. Instead, it is the ratio of faculty to students. If the ratio is 7:1 but 6.9999 of those faculty don’t work with students, then this is a meaningless measure.
6. Median undergraduate GPA (6%). Ditto. It is hard to interpret GPA given the school-to-school variability: a medical school with a lot of Swarthmore or Chicago graduates would look worse than a school with a lot of graduates from equally respected schools with easier grading.
7. Acceptance rate (1%). Whatever.
If we are going to rank…
Rankings should not exist. Unfortunately, I realize that they always will. If we insist on having rankings, let’s make them transparent and let them serve students rather than schools. Here is one potential methodology (a toy example of the scoring arithmetic follows the list).
1. 23%: Differential between mean MCAT scores and mean USMLE Step 2 CK scores. It’s one thing to recruit promising students and not ruin them. It’s another thing entirely to recruit average students and turn them into knowledgeable graduates.
2. 20%: Program directors’ rankings of schools based on experience with their own residents. The survey would include every PD in the US. Because some schools are better for aspiring internists than for aspiring surgeons, this would produce a different score for each specialty, and we could even report the scores by specialty.
3. 15%: Full-time equivalents paid to faculty specifically to teach: course directors, clerkship directors, ambulatory preceptors, physical diagnosis tutors, whatever. Teachers with time train good doctors. Including this in the rankings would incentivize schools to invest heavily in the education of their students.
4. 15%: Average number of times a school’s students have an entire patient encounter observed by a faculty member during their 4 years. You can't learn without feedback.
5. 15%: Average of NBME subject exam scores. Students need to know stuff.
6. 10%: Low debt burden. Not only should our best schools be investing in the training of their students, they should also be invested in reducing their students’ debt burden. Our students come overwhelmingly from families of wealth and too often enter high-paying specialties; reducing debt would help with both.
7. 2%: For schools that require students to read Ending Medical Reversal and give them a subscription to Sensible Medicine.
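To make the proposed arithmetic concrete, here is a minimal sketch of how the composite score might be computed. The weights are the ones listed above; everything else, including the assumption that each component has somehow been normalized to a 0–100 scale, the component names, the helper function, and the example numbers, is purely hypothetical and for illustration only.

```python
# Toy sketch of the proposed composite score (not an official methodology).
# Assumes each component has already been normalized to a 0-100 scale;
# how that normalization would be done is not specified above.

PROPOSED_WEIGHTS = {
    "mcat_to_step2_differential": 0.23,
    "program_director_rankings": 0.20,
    "teaching_ftes": 0.15,
    "observed_patient_encounters": 0.15,
    "nbme_subject_exams": 0.15,
    "low_debt_burden": 0.10,
    "reversal_and_sensible_medicine": 0.02,
}

def composite_score(components: dict[str, float]) -> float:
    """Weighted average of normalized (0-100) component scores."""
    assert abs(sum(PROPOSED_WEIGHTS.values()) - 1.0) < 1e-9  # weights sum to 100%
    return sum(PROPOSED_WEIGHTS[name] * components[name] for name in PROPOSED_WEIGHTS)

# Entirely made-up numbers for an imaginary school:
example_school = {
    "mcat_to_step2_differential": 72,
    "program_director_rankings": 80,
    "teaching_ftes": 65,
    "observed_patient_encounters": 55,
    "nbme_subject_exams": 78,
    "low_debt_burden": 40,
    "reversal_and_sensible_medicine": 100,
}

print(f"{composite_score(example_school):.1f}")  # about 68.3 out of 100
```

With these made-up inputs the composite works out to roughly 68 out of 100; the only point is that the weights sum to 100% and the final score is a simple weighted average of things that actually matter to students.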
Agreed
Hospitals should follow suit by withdrawing from the U.S. News hospital rankings. It's another list that incentivizes bad behavior and wastes the time of administrators tasked with constantly dunning their staff to vote for their hospital.