
Reporting and presenting information retrieval processes: the need for optimizing common practice in health technology assessment

Published online by Cambridge University Press:  13 October 2010

Christina Niederstadt
Affiliation:
Medical Review Board of the German Statutory Health Insurances Lower Saxony (MDK Niedersachsen)
Sigrid Droste
Affiliation:
Institute for Quality and Efficiency in Health Care (IQWiG)

Abstract

Background: Information retrieval (IR) in health technology assessment (HTA) calls for transparency and reproducibility, but common practice in the documentation and presentation of this process is inadequate in fulfilling this demand.

Objectives: Our objective is to promote good IR practice by presenting the retrieval conceptualization in a form readable to non-information specialists, and by reporting search strategies as they are effectively processed.

Methods: We performed a comprehensive database search (April 2010) to synthesize the current state of the art. We then developed graphical and tabular presentation methods, tested their feasibility on existing research questions, and defined recommendations.

Results: No generally accepted standard for reporting IR in HTA exists. We, therefore, developed templates for presenting the retrieval conceptualization, database selection, and additional hand-searching, as well as for presenting search histories of complex and lengthy search strategies. No single template fits all conceptualizations, but some can be applied to most processes. Database interface providers report queries as entered, not as they are actually processed. In PubMed®, the considerable difference between the entered and the processed query is shown under “Details.” Quality control and evaluation of search strategies using a validated tool such as the PRESS checklist is suboptimal when only entry-query-based search histories are available.
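The gap between an entered and a processed PubMed query can also be inspected programmatically: NCBI's E-utilities `esearch` endpoint returns the query as actually processed (after automatic term mapping) in the `QueryTranslation` element of its response. The sketch below, which only constructs the request URL and uses a hypothetical example query, assumes the standard E-utilities base URL; fetching the response requires network access.

```python
from urllib.parse import urlencode

# Standard NCBI E-utilities esearch endpoint.
EUTILS_BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def esearch_url(query: str, db: str = "pubmed") -> str:
    """Build an E-utilities esearch request URL.

    The XML response's <QueryTranslation> element contains the
    query as PubMed actually processes it, i.e., with automatic
    term mapping (MeSH expansion, field tags) applied -- the same
    information shown under "Details" in the web interface.
    """
    return EUTILS_BASE + "?" + urlencode({"db": db, "term": query})

# Hypothetical short entered query; the processed form returned by
# PubMed is typically far longer than what the searcher typed.
url = esearch_url('aspirin AND "myocardial infarction"')
print(url)
```

Reporting the `QueryTranslation` string rather than the entered query is one way to make a search history reflect the effectively processed strategy.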

Conclusions: Moving toward an internationally accepted IR reporting standard calls for advances in common reporting practice. Comprehensive, process-based reporting and presentation would make IR more understandable to readers other than information specialists and would facilitate quality control.

Type
THEME SECTION: INFORMATION RETRIEVAL FOR HTA
Copyright
Copyright © Cambridge University Press 2010


References

REFERENCES

1. Booth A. “Brimful of STARLITE”: Toward standards for reporting literature searches. J Med Libr Assoc. 2006;94:421–429, e205.
2. Busse R, Orvain J, Velasco M, et al. Best practice in undertaking and reporting health technology assessments: Working group 4 report. Int J Technol Assess Health Care. 2002;18:361–422.
3. Canadian Agency for Drugs and Technologies in Health. PRESS: Peer review of electronic search strategies. Ottawa: CADTH; 2008.
4. Centre for Evidence Based Medicine (CEBM), University of Oxford. Asking focused questions. http://www.cebm.net/index.aspx?o=1036 (accessed May 11, 2010).
5. Centre for Reviews and Dissemination. Systematic reviews: CRD's guidance for undertaking reviews in health care. York: CRD, University of York; 2009.
6. Danish Centre for Health Technology Assessment. Health technology assessment handbook. Copenhagen: DACEHTA; 2008.
7. Etutorials.org. Learning UML. http://etutorials.org/programming/learning+uml/ (accessed May 11, 2010).
8. Golder S, Loke Y, McIntosh HM. Poor reporting and inadequate searches were apparent in systematic reviews of adverse effects. J Clin Epidemiol. 2008;61:440–448.
9. International Network of Agencies for Health Technology Assessment. A checklist for health technology assessment reports. Stockholm: INAHTA; 2007.
10. Kitchenham B. Procedures for performing systematic reviews. Eversleigh: National Information and Communications Technology Centre of Excellence Australia (NICTA); 2004.
11. Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: Explanation and elaboration. PLoS Med. 2009;6:e1000100.
12. Patrick TB, Demiris G, Folk LC, et al. Evidence-based retrieval in evidence-based medicine. J Med Libr Assoc. 2004;92:196–199.
13. Roundtree AK, Kallen MA, Lopez-Olivo MA, et al. Poor reporting of search strategy and conflict of interest in over 250 narrative and systematic reviews of two biologic agents in arthritis: A systematic review. J Clin Epidemiol. 2009;62:128–137.
14. Sampson M, McGowan J, Tetzlaff J, Cogo E, Moher D. No consensus exists on search reporting methods for systematic reviews. J Clin Epidemiol. 2008;61:748–754.
15. Sandelowski M, Barroso J. Handbook for synthesizing qualitative research. New York: Springer; 2007.
16. Shea BJ, Grimshaw JM, Wells GA, et al. Development of AMSTAR: A measurement tool to assess the methodological quality of systematic reviews. BMC Med Res Methodol. 2007;7:10.
17. Spoerri A. InfoCrystal: A visual tool for information retrieval. In: Card SK, MacKinlay JD, Shneiderman B, eds. Readings in information visualization: Using vision to think. San Diego: Academic Press; 1999.
18. Stroup DF, Berlin JA, Morton SC, et al. Meta-analysis of observational studies in epidemiology: A proposal for reporting. JAMA. 2000;283:2008–2012.
19. The AGREE Collaboration. Appraisal of Guidelines for Research and Evaluation: AGREE instrument. London: St George's Hospital Medical School; 2001.
20. The Campbell Collaboration Steering Committee, ed. The Campbell Collaboration information retrieval policy brief. Oslo: The Campbell Collaboration; 2004.
21. The Campbell Collaboration. Systematic review information retrieval checklist: Revised 13/02/2009. Oslo: The Campbell Collaboration; 2009.
22. The Cochrane Collaboration. Cochrane handbook for systematic reviews of interventions: Version 5.0.2. Oxford: The Cochrane Collaboration; 2009.
23. Yoshii A, Plaut DA, McGraw KA, Anderson MJ, Wellik KE. Analysis of the reporting of search strategies in Cochrane systematic reviews. J Med Libr Assoc. 2009;97:21–29.
Supplementary material

Niederstadt and Droste supplementary material: Tables and figures (File, 200 KB)