For the last 10 years, U.S. News & World Report has published an annual ranking of the top American hospitals in 16 specialties and 20+ metro areas. In the 2010 survey, only 152 of the 4,852 hospitals that were evaluated performed well enough to rank in any specialty. Of those 152, fourteen were given 'Honor Roll' status for ranking near the top in at least six specialties.
The rankings are based on a weighted combination of several metrics: reputation (32.5%), a mortality index (32.5%), a patient safety index (5%) and other factors including nurse staffing and technology (30%). While I know rankings sell magazines, I’m surprised that reputation – which is based on the survey results of 9,000 doctors – is weighted so highly. It makes the ranking feel more like a popularity contest than a scientific result.
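To make the weighting concrete, here is a minimal sketch of a weighted-score calculation using the percentages above. The metric names, the 0–100 scale, and the example numbers are all hypothetical; U.S. News does not publish its exact computation.

```python
# Hypothetical sketch of the weighted-score formula described above.
# Weights are the published percentages; everything else is assumed.
WEIGHTS = {
    "reputation": 0.325,      # survey of ~9,000 doctors
    "mortality": 0.325,       # mortality index
    "patient_safety": 0.05,   # patient safety index
    "other": 0.30,            # nurse staffing, technology, etc.
}

def overall_score(metrics: dict) -> float:
    """Weighted sum of metric scores (assumed normalized to 0-100)."""
    return sum(WEIGHTS[name] * metrics[name] for name in WEIGHTS)

# Example: a hospital strong on reputation but middling elsewhere
hospital = {"reputation": 90, "mortality": 60, "patient_safety": 70, "other": 65}
print(overall_score(hospital))  # 71.75
```

Note how heavily the result leans on the reputation term: with a 32.5% weight, a strong reputation score can offset mediocre scores on the objective measures.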
An article recently published in the Annals of Internal Medicine seems to confirm my intuition. The author concluded that the relative rankings of the top hospitals were largely based on their subjective reputations. The report found that:
… the rankings based on reputation score alone agreed with U.S. News & World Report’s overall rankings 100% of the time for the top hospital in each specialty, 97% for the top 5 hospitals, 91% for the top 10 hospitals, and 89% for the top 20 hospitals.
On the other hand, the objective measures of hospital quality, like mortality and patient safety, didn't correlate with a high ranking. The highest-ranked hospitals weren't the ones with the best medical outcomes but rather the ones that were most popular with other doctors.
As a patient, you could get good medical care at the top-ranked hospitals. But you might not. Your local hospital may not be ranked, but it still might be very good. And since it's less popular, you're likely to get faster treatment.