
Communicable Disease Surveillance Systems in Disasters: Application of the Input, Process, Product, and Outcome Framework for Performance Assessment

Published online by Cambridge University Press:  02 April 2018

Javad Babaie*
Affiliation:
Health Services Management, School of Management and Medical Informatics, Tabriz University of Medical Sciences, Tabriz, East Azerbaijan, Iran; Iranian Center of Excellence in Health Management, School of Management and Medical Informatics, Tabriz University of Medical Sciences, Tabriz, East Azerbaijan, Iran; Tabriz Health Services Management Research Center, Tabriz University of Medical Sciences, Tabriz, East Azerbaijan, Iran
Ali Ardalan
Affiliation:
Department of Disaster Public Health, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran; Department of Disaster and Emergency Health, National Institute of Health Research, Tehran University of Medical Sciences, Tehran, Iran
Hasan Vatandoost
Affiliation:
Medical Entomology and Vector Control, Tehran University of Medical Sciences, Tehran, Iran
Mohammad Mahdi Goya
Affiliation:
Communicable Diseases Management Center, Ministry of Health, Tehran, Iran
Ali Akbarisari
Affiliation:
Health Management and Economics, Tehran University of Medical Sciences, Tehran, Iran
*
Correspondence and reprint requests to Javad Babaie, Health Services Management, School of Management and Medical Informatics, Tabriz University of Medical Sciences, Tabriz, East Azerbaijan, Iran (e-mail: [email protected]).

Abstract

Objective

One of the most important measures following disasters is setting up a communicable disease surveillance system (CDSS). This study aimed to develop indicators to assess the performance of CDSSs in disasters.

Method

In this 3-phase study, a qualitative study was first conducted through in-depth, semistructured interviews with experts in health in disasters and emergencies, health services managers, and communicable diseases center specialists. The interviews were analyzed, and CDSS performance assessment (PA) indicators were extracted. The appropriateness of these indicators was then examined through a questionnaire administered to experts and to heads of the communicable diseases departments of universities of medical sciences. Finally, the designed indicators were weighted using the analytic hierarchy process (AHP) approach and Expert Choice software.
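As a rough illustration of how AHP-derived weights can be computed, the sketch below uses numpy to take a pairwise comparison matrix, extract its principal eigenvector as the priority weights, and check the consistency ratio. The matrix itself is hypothetical and chosen only to approximate the priorities reported in the Results; the study used the Expert Choice software rather than this code.

```python
# Minimal AHP weighting sketch (illustrative only; the comparison matrix is hypothetical).
import numpy as np

# Hypothetical judgments on Saaty's 1-9 scale for the four dimensions
# (input, process, product, outcome).
A = np.array([
    [1,   2,   4,   6],
    [1/2, 1,   3,   5],
    [1/4, 1/3, 1,   2],
    [1/6, 1/5, 1/2, 1],
], dtype=float)

# Priority weights = normalized principal eigenvector of the matrix.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio; judgments are usually accepted when CR < 0.10.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.90  # 0.90 = random index for a 4x4 matrix

for name, weight in zip(("input", "process", "product", "outcome"), w):
    print(f"{name:8s} {100 * weight:5.1f}")
print(f"CR = {cr:.3f}")
```

With this illustrative matrix the normalized weights come out close to the reported 49.1, 31.4, 12.7, and 6.8; the actual expert judgments behind those figures are not given in the abstract.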

Results

In this study, 51 indicators were designed, of which 10 related to input (19.61%), 17 to process (33.33%), 13 to product (25.49%), and 11 to outcome (21.57%). In the weighting, input received the maximum score (49.1), and process, product, and outcome received 31.4, 12.7, and 6.8, respectively.
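For clarity, the category percentages are each group's share of the 51 indicators (10/51 = 19.61%, 17/51 = 33.33%, 13/51 = 25.49%, 11/51 = 21.57%), and the four weights sum to 100, so they can be read as percentage priorities.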

Conclusion

Through 3 study phases, PA indicators were developed for the 4 stages of the results chain: input, process, product, and outcome. The authors believe that these PA indicators can assess the system’s performance and its achievements in response to disasters. (Disaster Med Public Health Preparedness. 2019;13:158–164)

Type
Original Research
Copyright
Copyright © Society for Disaster Medicine and Public Health, Inc. 2018 

