Assessing the necessity of the standardized infection ratio for reporting central line-associated bloodstream infections


Abstract

This brief article presents results that support the contention that risk adjustment via the standardized infection ratio (SIR) for reporting central line-associated bloodstream infections (CLABSIs) may be no more predictive than standard rate adjustment using CLABSIs per central line day (i.e., CLABSI rates). Recent data posted on the U.S. Department of Health and Human Services' Hospital Compare website showed that nearly 70% of the 1,721 reporting hospitals with at least 1,000 central line days had five or fewer infections during 2011. These hospitals accounted for 39.3% of total central line days and had a significantly lower SIR than poorer-performing hospitals with six or more CLABSIs (p < 0.0001). In addition, 19 hospitals are presented that had between 9,000 and more than 22,000 central line days yet recorded zero to three CLABSIs. Some of these hospitals were university referral centers and inner-city facilities. There was great variation in CLABSI cases among U.S. hospitals. Evidence is mounting that all hospitals should be able to achieve a near-zero incidence of CLABSIs, and that these infections may in fact be near 'never events', which raises the question of whether risk adjustment with the SIR is needed and whether it adds more information than rate adjustment using CLABSI rates. © 2013 Saman, Kavanagh.
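To make the contrast between the two metrics concrete, the following is a minimal sketch in Python of the standard NHSN-style calculations: the CLABSI rate is infections per 1,000 central line days, and the SIR is observed infections divided by the number predicted from national baseline data. The function names and the example figures (a hypothetical hospital with 12,000 central line days, 2 CLABSIs, and a predicted count of 10.8) are illustrative assumptions, not values from the article.

```python
def clabsi_rate(infections: int, central_line_days: int) -> float:
    """CLABSI rate: infections per 1,000 central line days."""
    return infections / central_line_days * 1000


def standardized_infection_ratio(observed: int, predicted: float) -> float:
    """SIR: observed infections over the risk-adjusted predicted count."""
    return observed / predicted


# Hypothetical hospital: 12,000 central line days, 2 observed CLABSIs,
# with 10.8 infections predicted by the national baseline model.
print(clabsi_rate(2, 12_000))                 # ~0.17 CLABSIs per 1,000 line days
print(standardized_infection_ratio(2, 10.8))  # SIR ~0.19 (fewer than predicted)
```

The article's question, in these terms, is whether the extra modeling behind the `predicted` denominator of the SIR conveys more information than the simple rate already does.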

Citation (APA)

Saman, D. M., & Kavanagh, K. T. (2013). Assessing the necessity of the standardized infection ratio for reporting central line-associated bloodstream infections. PLoS ONE, 8(11). https://doi.org/10.1371/journal.pone.0079554
