
PD24 Robust Real-World Evidence Generation In Comparative Effects Studies – NICE’s Methods Guidance

Published online by Cambridge University Press:  23 December 2022


Abstract

Introduction

Recent reviews have shown that many real-world evidence (RWE) studies suffer from avoidable methodological flaws. Meanwhile, the National Institute for Health and Care Excellence (NICE) is seeing an increase in RWE submissions in Health Technology Appraisals and is keen to support the use of this evidence. However, limited guidance exists for the development and assessment of RWE, risking both missed opportunities for unbiased evidence generation and inconsistent decision making based on that evidence. As part of its RWE framework, NICE has developed methods guidance to provide clear expectations for the conduct and reporting of non-randomized comparative effects studies using real-world data.

Methods

A conceptual model and draft framework were developed based on established international best practices in RWE and observational research. These were refined with focused literature searches, for example, on the use of external control arm studies. We then engaged with external stakeholders to incorporate their feedback and to develop case studies. A reporting template was developed and tested on multiple use cases.

Results & Conclusions

The guidance stresses the central importance of a target trial approach to study design, for example, adopting an active-comparator, new-user design where possible. Target trial emulation is a useful tool to improve the quality and transparency of RWE studies, helping to overcome selection and confounding biases. Various other study design and analytical approaches are outlined for addressing confounding bias and biases due to missing data, measurement error, or misclassification, which are common challenges in RWE. Alongside traditional approaches to sensitivity analysis, the framework promotes quantitative bias analysis, which comprises a range of methods to assess and communicate the potential impact of residual bias on study findings by quantifying its direction, magnitude, and uncertainty. A reporting template, based on common methodological pitfalls, is provided to help evidence developers consider key areas of bias in their work and to inform reviewers of any approaches used to investigate or resolve these.
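The abstract does not specify which quantitative bias analysis methods the guidance recommends, but one widely used example is the E-value of VanderWeele and Ding (2017): the minimum strength of association, on the risk-ratio scale, that an unmeasured confounder would need with both treatment and outcome to fully explain away an observed association. A minimal sketch, with an illustrative (hypothetical) function name:

```python
import math

def e_value(rr: float) -> float:
    """E-value for an observed risk ratio (VanderWeele & Ding, 2017).

    Returns the minimum risk-ratio-scale association an unmeasured
    confounder would need with both exposure and outcome to fully
    explain away the observed effect estimate.
    """
    if rr <= 0:
        raise ValueError("risk ratio must be positive")
    # For protective effects (RR < 1), work with the reciprocal.
    rr = rr if rr >= 1 else 1.0 / rr
    # E = RR + sqrt(RR * (RR - 1))
    return rr + math.sqrt(rr * (rr - 1.0))

# An observed risk ratio of 2.0 could be explained away only by a
# confounder associated with both treatment and outcome by RR >= ~3.41.
print(round(e_value(2.0), 3))  # 3.414
```

Reporting such a threshold alongside the primary estimate is one way to quantify the magnitude of unmeasured confounding that would be needed to alter a study's conclusions.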

Type
Poster Debate
Copyright
© The Author(s), 2022. Published by Cambridge University Press