
The validity of validation: A practical assessment

Published online by Cambridge University Press: 24 January 2020

Eric C. Stone
Affiliation:
Division of Infectious Diseases, Mayo Clinic, Rochester, Minnesota
Vickie Miller
Affiliation:
Division of Infectious Diseases, Mayo Clinic, Rochester, Minnesota
Heidi J. Shedenhelm
Affiliation:
Department of Nursing, Mayo Clinic, Rochester, Minnesota
Walter C. Hellinger
Affiliation:
Division of Infectious Diseases, Mayo Clinic, Jacksonville, Florida
John C. O’Horo*
Affiliation:
Division of Infectious Diseases, Mayo Clinic, Rochester, Minnesota
*
Author for correspondence: John C. O’Horo, MD, MPH, E-mail: [email protected]

Abstract

Objective:

To assess the time to achieve reliable reporting of electronic health record data compared with manual reporting during validation.

Design:

Secondary analysis of aggregate data for number of patients present, number of patients with a central venous catheter, and number of patients with an indwelling urinary catheter during validation of an electronic health record reporting tool.

Setting:

Mayo Clinic Health System in Wisconsin.

Participants:

Mayo Clinic infection prevention and control staff, unit champions, and all inpatients.

Methods:

We simultaneously collected electronic and manual counts of device data, compared them, and reviewed discrepancies to determine their source. If manual data entry was incorrect, manual counts were coded as inaccurate. If electronically abstracted data did not reflect an accurate count, errors were attributed to the system. Data were compared using standard statistical methods.
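As an illustration only, not code from the study, the discrepancy-attribution rule described above (manual-entry errors coded as manual inaccuracies, electronic-abstraction errors attributed to the system) might be sketched as follows; all field names and example values are hypothetical.

```python
# Illustrative sketch (not from the article): comparing simultaneous manual and
# electronic daily device counts and classifying discrepancies by source.
# Field names, example values, and the attribution rule shown are assumptions.
from dataclasses import dataclass

@dataclass
class DailyCount:
    date: str
    manual: int        # count entered by the unit champion
    electronic: int    # count abstracted from the electronic health record
    true_count: int    # count confirmed during discrepancy review

def classify(day: DailyCount) -> str:
    """Attribute a discrepancy: incorrect manual entry -> manual error;
    inaccurate electronic abstraction -> system error."""
    if day.manual == day.electronic:
        return "agree"
    if day.manual != day.true_count:
        return "manual_error"
    return "system_error"

counts = [
    DailyCount("2019-01-01", manual=12, electronic=13, true_count=13),
    DailyCount("2019-01-02", manual=11, electronic=11, true_count=11),
    DailyCount("2019-01-03", manual=10, electronic=9, true_count=10),
]

results = [classify(d) for d in counts]
agreement = results.count("agree") / len(results)
print(f"daily agreement: {agreement:.0%}; discrepancies: {results}")
```

In practice, daily agreement of this kind would be tracked over the validation period to judge when electronic counts can be relied on without continued manual comparison.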

Results:

Within 30 days after beginning validation of electronic reporting for central venous catheter days and urinary catheter days, electronic counts were durably more reliable than manual counts.

Conclusions:

Manual validation of electronic data capture and reporting can be shorter than the 90 days currently mandated by National Healthcare Safety Network criteria. Compared with a longer validation period, a shorter validation period may yield substantial savings while achieving the same validity.

Type
Original Article
Copyright
© 2020 by The Society for Healthcare Epidemiology of America. All rights reserved

Supplementary material

Stone et al. supplementary material (File, 29.7 KB)