Uniform Methods Urged for Grading Hospital Report Cards
Posted Nov 09 2010 9:00am
Inconsistent surveillance practices threaten validity of rankings, researchers say
TUESDAY, Nov. 9 (HealthDay News) -- Differences in hospital surveillance methods affect the quality of public reporting of bloodstream infections on hospital report cards, researchers have found.
"Public reporting of hospital-specific infection rates is widely promoted as a means to improve patient safety. Central line [central venous catheter]-associated bloodstream infection [BSI] rates are considered a key patient safety measure because such infections are frequent, lead to poor patient outcomes, are costly to the medical system and are preventable," Dr. Michael Y. Lin, of Rush University Medical Center in Chicago, and colleagues wrote in the Nov. 10 issue of the Journal of the American Medical Association.
"Publishing infection rates on hospital report cards, which is increasingly required by regulatory agencies, is intended to facilitate interhospital comparisons that inform health care consumers and provide incentive for hospitals to prevent infections. Interhospital comparisons of infection rates, however, are valid only if the methods of surveillance are uniform and reliable across institutions," the researchers explained.
In the study, Lin's team assessed central line-associated BSI rates in 20 intensive care units at four medical centers between 2004 and 2007. During that time, hospital infection control specialists conducted routine surveillance using U.S. Centers for Disease Control and Prevention definitions. The researchers later calculated infection rates with a computer algorithm that used the same CDC surveillance definitions.
According to infection control specialists' surveillance, the median rate of central line-associated BSI was 3.3 infections per 1,000 central line-days; however, the median rate determined by the computer algorithm was nine infections per 1,000 central line-days.
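The rates above follow the standard formula for this metric: the number of bloodstream infections divided by the number of central line-days, multiplied by 1,000. A minimal sketch of that arithmetic, using hypothetical counts (the function name and inputs are illustrative, not from the study):

```python
def bsi_rate(infections: int, central_line_days: int) -> float:
    """Central line-associated BSI rate per 1,000 central line-days."""
    return infections / central_line_days * 1000

# Hypothetical ICU: 33 infections over 10,000 central line-days
print(bsi_rate(33, 10_000))  # -> 3.3
```

Because the denominator is device-days rather than patient counts, the metric adjusts for how long catheters stay in place, which is why two hospitals with the same number of infections can have very different rates.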
Further analysis revealed only a weak overall correlation between the rates determined by the computer algorithm and those determined by infection control specialists. The researchers also found major differences among medical centers in how the computer-algorithm rates related to the specialist-determined rates.
"The medical center that had the lowest rate by traditional surveillance [2.4 infections per 1,000 central line-days] had the highest rate by computer algorithm [12.6 infections per 1,000 central line-days]," the study authors wrote.
"In this study, we found strong evidence of institutional variation in central line-associated BSI surveillance performance among medical centers. Inconsistent surveillance practice can have a significant effect on the relative ranking of hospitals, which threatens the validity of the metric used by both funding agencies and the public to compare hospitals," they concluded in the report. "As central line-associated BSI rates gain visibility and importance . . . we should seek and test surveillance measures that are as reliable and objective as possible."
The U.S. Department of Health and Human Services has a Web site for those seeking to compare hospitals.
(SOURCE: Journal of the American Medical Association, news release, Nov. 9, 2010)