OP71 Evidence Grading Systems Used In Health Technology Assessment Practice
Published online by Cambridge University Press: 03 January 2019
Abstract
To facilitate moving from research findings to conclusions when conducting systematic reviews (SRs) and health technology assessments (HTAs), evidence grading systems (EGSs) have been developed to assess the quality of bodies of evidence and to communicate (un)certainty about the effects of evaluated technologies. Use of EGSs has become an essential step in conducting SRs and HTAs, and those relying on review conclusions should be aware of their potential limitations.
This study aims to identify EGSs used in SR and HTA practice and to summarize findings on their inter-rater reliability (IRR). Relevant sources were searched to identify EGSs used in recently published SRs, as well as IRR studies of available EGSs. Members of the International Network of Agencies for Health Technology Assessment (INAHTA) were surveyed regarding their current approaches.
Preliminary results indicate that only two conceptually similar EGSs are currently used by several organizations in SR and HTA practice: (i) the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach and (ii) the Agency for Healthcare Research and Quality Evidence-based Practice Center Program (AHRQ-EPC) approach. Both emphasize a structured and transparent method. However, results from published IRR studies suggest a risk of variability in their application, owing to researchers' differing levels of training and experience in using them and to the complexity and heterogeneity of evidence in SRs.
Validated EGSs can play a critical role in whether and how research findings are eventually translated into practice. However, our results indicate a low level of uptake of EGSs in HTA practice. Both currently used EGSs are susceptible to inconsistent application, allowing different researchers to grade the same body of evidence differently, and their performance has not been robustly explored in terms of IRR. If these results stand up to replication, the conclusions of published SRs cannot be relied upon, which has implications for the decisions they inform.
Type: Oral Presentations
Copyright © Cambridge University Press 2018