
Use of Administrative Data in Efficient Auditing of Hospital-Acquired Surgical Site Infections, New York State 2009–2010

Published online by Cambridge University Press:  02 January 2015

Valerie B. Haley,* Carole Van Antwerpen, Boldtsetseg Tserenpuntsag, Kathleen A. Gase, Peggy Hazamy, Diana Doughty, Marie Tsivitis, and Rachel L. Stricof

Affiliations: New York State Department of Health, Bureau of Healthcare Associated Infections, Albany, New York (all authors); Council of State and Territorial Epidemiologists, Atlanta, Georgia (R.L.S.)

*Corresponding author: Corning Tower Room 2580, Albany, NY 12237 ([email protected])

Abstract

Objective.

To efficiently validate the accuracy of surgical site infection (SSI) data reported to the National Healthcare Safety Network (NHSN) by New York State (NYS) hospitals.

Design.

Validation study.

Setting.

176 NYS hospitals.

Methods.

NYS Department of Health staff validated the data reported to NHSN by review of a stratified sample of medical records from each hospital. The four strata were (1) SSIs reported to NHSN; (2) records with an indication of infection from diagnosis codes in administrative data but not reported to NHSN as SSIs; (3) records with discordant procedure codes in NHSN and state data sets; (4) records not in the other three strata.
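The four-way stratification above can be sketched as a simple ordered decision rule; the function and field names below are illustrative assumptions, not the authors' actual variables, and the logic assumes strata are assigned in the order listed so each record falls into exactly one stratum.

```python
def assign_stratum(reported_ssi: bool,
                   admin_dx_infection: bool,
                   discordant_procedure_codes: bool) -> int:
    """Assign a surgical record to one of four audit strata.

    Hypothetical sketch of the stratification described in the Methods.
    Checks run in order, so each record lands in exactly one stratum:
      1 - SSI reported to NHSN
      2 - infection indicated in administrative diagnosis codes,
          but no SSI reported to NHSN
      3 - procedure codes discordant between NHSN and state data sets
      4 - none of the above
    """
    if reported_ssi:
        return 1
    if admin_dx_infection:
        return 2
    if discordant_procedure_codes:
        return 3
    return 4
```

For example, a record with an infection diagnosis code but no NHSN-reported SSI would fall into stratum 2 regardless of its procedure-code status, since the checks are ordered.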

Results.

A total of 7,059 surgical charts (6% of the procedures reported by hospitals) were reviewed. In stratum 1, 7% of reported SSIs did not meet the criteria for inclusion in NHSN and were subsequently removed. In stratum 2, 24% of records indicated missed SSIs not reported to NHSN, whereas in strata 3 and 4, only 1% of records indicated missed SSIs; these SSIs were subsequently added to NHSN. Also, in stratum 3, 75% of records were not coded for the correct NHSN procedure. Errors were highest for colon data; the NYS colon SSI rate increased by 7.5% as a result of hospital audits.
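The net effect of the audit corrections (removing reported SSIs that failed NHSN criteria, adding missed SSIs found in strata 2-4) can be illustrated with toy numbers; all counts below are hypothetical and chosen only so that the corrected rate lands 7.5% above the original, mirroring the colon SSI result reported above.

```python
# Toy illustration (hypothetical counts) of how audit corrections
# shift an SSI rate. Only the 7% removal fraction and the 7.5% net
# increase come from the article; the raw counts are invented.
procedures = 10_000
reported_ssi = 200

removed = round(0.07 * reported_ssi)   # 7% of reported SSIs failed NHSN criteria
missed = 29                            # hypothetical missed SSIs added after audit

rate_before = 100 * reported_ssi / procedures
rate_after = 100 * (reported_ssi - removed + missed) / procedures

print(f"before: {rate_before:.2f} SSIs per 100 procedures")
print(f"after:  {rate_after:.2f} SSIs per 100 procedures")
```

The point of the sketch is that missed SSIs discovered in the audit can outnumber the false-positive reports removed, so the corrected rate ends up higher than the originally reported rate.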

Conclusions.

Audits are vital for ensuring the accuracy of hospital-acquired infection (HAI) data so that hospital HAI rates can be fairly compared. Use of administrative data increased the efficiency of identifying problems in hospitals' SSI surveillance that led to unreported SSIs and to errors in denominator data.

Type
Original Article
Copyright
Copyright © The Society for Healthcare Epidemiology of America 2012 

