
A Framework to Assess the Quality of Non-traditional Articles in the Field of Disaster Response and Management

Published online by Cambridge University Press:  05 July 2018

Mary Hall*
Affiliation:
School of Health and Related Research, University of Sheffield, UK
Chris Cartwright
Affiliation:
School of Health and Related Research, University of Sheffield, UK
Andrew C. K. Lee
Affiliation:
School of Health and Related Research, University of Sheffield, UK
*Correspondence and reprint requests to Ms Mary Hall, School of Health and Related Research, University of Sheffield, 30 Regent St, Sheffield, S1 4DA, UK (e-mail: [email protected]).

Abstract

Objective

While carrying out a scoping review of earthquake response, we found that there is no universal, standardized approach for assessing the quality of disaster evidence, much of which is variable in quality or not peer reviewed. Without a framework to ascertain the value and validity of this literature, there is a danger that valuable insights may be lost. We propose a theoretical framework that may, with further validation, address this gap.

Methods

Existing frameworks, namely the quality of reporting of meta-analyses (QUOROM) statement, the meta-analysis of observational studies in epidemiology (MOOSE) guidelines, the Cochrane assessment of bias, the Critical Appraisal Skills Programme (CASP) checklists, the strengthening the reporting of observational studies in epidemiology (STROBE) statement, and the consensus guidelines on reports of field interventions in disasters and emergencies (CONFIDE), were analyzed to identify key domains of quality. Supporting statements, based on these existing frameworks, were developed for each domain to form an overall theoretical framework of quality. This framework was piloted on a data set of publications from a separate scoping review.

Results

Four domains of quality were identified: robustness, generalizability, added value, and ethics, with 11 scored supporting statements. Although 73 of 111 papers (66%) scored below 70%, a sizeable proportion (34%) scored higher.
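To make the scoring concrete, the sketch below (in Python) shows one way a paper could be scored against the four domains. The domain names and the total of 11 supporting statements come from the results above; the allocation of statements to domains, the 0-2 scale per statement, and the example scores are illustrative assumptions, not the authors' published instrument.

```python
# A minimal sketch of an assumed scoring scheme; not the authors' published tool.
DOMAINS = {
    "robustness": 4,
    "generalizability": 3,
    "added value": 2,
    "ethics": 2,
}  # 11 supporting statements in total; the split across domains is assumed

MAX_PER_STATEMENT = 2  # assumed scale: 0 = not met, 1 = partially met, 2 = fully met


def quality_score(statement_scores: dict) -> float:
    """Return a paper's overall quality score as a percentage of the maximum."""
    total, maximum = 0, 0
    for domain, n_statements in DOMAINS.items():
        scores = statement_scores.get(domain, [])
        if len(scores) != n_statements:
            raise ValueError(f"expected {n_statements} scores for '{domain}'")
        total += sum(scores)
        maximum += n_statements * MAX_PER_STATEMENT
    return 100.0 * total / maximum


# Example paper: strong on robustness and ethics, weaker on generalizability.
example = {
    "robustness": [2, 2, 1, 2],
    "generalizability": [1, 1, 0],
    "added value": [1, 2],
    "ethics": [2, 2],
}
print(f"Overall quality: {quality_score(example):.0f}%")  # 16/22 points, about 73%
```

Under these assumptions, the 70% threshold reported above corresponds to scoring at least 16 of the 22 available points.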

Conclusion

Our theoretical framework presents, for debate and further validation, a method of assessing the quality of non-traditional studies, thereby supporting a best-available-evidence approach to disaster response. (Disaster Med Public Health Preparedness. 2019;13:147-151)

Type
Brief Report
Copyright
Copyright © Society for Disaster Medicine and Public Health, Inc. 2018 


REFERENCES

1. Knox Clarke P, Darcy J. Insufficient Evidence? The Quality of Use of Evidence in Humanitarian Action. ALNAP Study. London: ALNAP/ODI; 2014.
2. Challen K, Lee ACK, Booth A, et al. Where is the evidence for emergency planning: a scoping review. BMC Public Health. 2012;12:542.
3. Bradt DA. Evidence-based decision making (part 1): origins and evolution in the health sciences. Prehosp Disaster Med. 2009;24(4):298-304.
4. Bradt DA, Aitken P. Disaster medicine reporting: the need for new guidelines and the CONFIDE statement. Emerg Med Australas. 2010;22:483-487.
5. Cartwright C, Hall M, Lee ACK. The changing health priorities of earthquake response and implications for preparedness: a scoping review. Public Health. 2017;150:60-70.
6. Moher D, Cook DJ, Eastwood S, et al. Improving the quality of reports of meta-analyses of randomised controlled trials: the QUOROM statement. Br J Surg. 2000;87:1448-1454.
7. Stroup DF, Berlin JA, Morton SC, et al. Meta-analysis of observational studies in epidemiology: a proposal for reporting. Meta-analysis of Observational Studies in Epidemiology (MOOSE) Group. JAMA. 2000;283:2008-2012.
9. CASP checklists. Published 2018. http://www.casp-uk.net/casp-tools-checklists. Accessed February 27, 2017.
10. von Elm E, Altman DG, Egger M, et al. Strengthening the reporting of observational studies in epidemiology (STROBE) statement: guidelines for reporting observational studies. BMJ. 2007;335:806-808.
11. De Brún C. Finding the evidence: a key step in the information production process. The Information Standard; 2013. https://www.england.nhs.uk/wp-content/uploads/2017/02/tis-guide-finding-the-evidence-07nov.pdf. Accessed October 11, 2017.
12. Evidence Aid. Published 2018. https://www.evidenceaid.org. Accessed May 10, 2018.
13. Disaster Information Management Resource Center. https://www.disaster.nlm.nih.gov. Accessed May 10, 2018.