
Considerations for transferability of health technology assessments: a scoping review of tools, methods, and practices

Published online by Cambridge University Press:  02 November 2022

Lieke Fleur Heupink*
Affiliation:
Norwegian Institute of Public Health, Global Health, Division for Health Services, Oslo, Norway
Elizabeth Fleur Peacocke
Affiliation:
Norwegian Institute of Public Health, Global Health, Division for Health Services, Oslo, Norway
Ingvil Sæterdal
Affiliation:
Norwegian Institute of Public Health, Global Health, Division for Health Services, Oslo, Norway
Lumbwe Chola
Affiliation:
Norwegian Institute of Public Health, Global Health, Division for Health Services, Oslo, Norway
Katrine Frønsdal
Affiliation:
Norwegian Institute of Public Health, Global Health, Division for Health Services, Oslo, Norway
*
*Author for correspondence: Lieke Fleur Heupink, E-mail: liekefleur.heupink@fhi

Abstract

Health technology assessment (HTA) is commonly used to guide evidence-informed decisions to optimize resource use, prioritize policies, and support countries to achieve universal health coverage. Producing HTAs requires time, scientific expertise, and political commitment, but these are not available in all settings – especially in low- and middle-income countries (LMIC) where HTA processes may be less institutionalized. Transferring and adapting existing HTAs to local settings may offer a solution while reducing duplication of effort. This scoping review aims to provide an overview of tools, methods, approaches, and considerations that can aid HTA transfers. We systematically searched (from 2005 to 2020) six databases and, using predefined inclusion criteria, included twenty-two studies. Data extraction followed a structured process, while synthesis was more iterative. We identified a common approach for HTA transfers. It follows the de novo process of undertaking original HTAs, but with additional steps to assess relevance (applicability), quality, and transferability, as well as steps to adapt parameters where necessary. The EUnetHTA Adaptation Toolkit was the only tool that provided guidance for adapting multiple HTA domains. Other tools were specific to systematic reviews (n = 1) or economic evaluations (n = 12), of which one provided guidance for systematic reviews of economic evaluations. Eight papers reported transferring an HTA, with only one transferring to an LMIC. Finally, we reported issues that may facilitate or hinder transferability. In conclusion, we identified fourteen transfer approaches in the form of guidance or checklists, but harmonized and pragmatic guidance for HTA transfers suited to settings with limited HTA capacity seems warranted.

Type
Article Commentary
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2022. Published by Cambridge University Press

Introduction

Health technology assessment (HTA) can facilitate transparent, accountable, and evidence-based decisions that support equitable and efficient allocation of healthcare resources (Reference O’Rourke, Oortwijn and Schuller1). However, producing HTAs requires considerable time and resources, and reusing existing HTAs can be helpful, particularly for urgent policy decisions (Reference Nemzoff, Ruiz and Chalkidou2). The process of transferring existing HTAs between different settings (i.e., countries, regions, or HTA agencies) could thereby accelerate the production of HTAs, as well as reduce duplication, improve knowledge sharing, and aid countries with fewer resources for HTA (Reference Bijlmakers, Mueller, Kahveci, Chen and van der Wilt3–Reference Augustovski, Iglesias and Manca5).

Generally, HTA production follows four steps, namely “Topic Identification, Selection and Prioritization (TISP),” “HTA analysis,” “Appraisal,” and “Implementation” (6). There is consensus that stakeholders (i.e., patients, providers, policymakers, etc.) should be involved in each step of the HTA process, as this brings many benefits, such as the inclusion of relevant outcomes for quality of life and cost-effectiveness, a better understanding of uncertainty in data and results, and smoother implementation (Reference Pinho-Gomes, Stone and Shaw7;Reference Trowman, Powers and Ollendorf8). This is especially true when transferring HTAs, where most or all evidence does not come from local sources and the involvement of stakeholders helps ensure local relevance. As such, HTAs may inform a broad range of decision-making processes, including reimbursement, coverage, procurement, clinical guidelines, priority setting, and public health programs.

The end-products of HTA processes often vary but may be divided into four main types: HTA reports or full HTAs; mini-HTAs; rapid assessments; and national appraisals. Each can support different decision-making processes depending on the context, local requirements, the technology under assessment, and the type of decision (Reference Merlin, Tamblyn and Ellery9). HTA reports are the most comprehensive, describing the health technology and its current use, evaluating its clinical effectiveness and safety, and often including an economic evaluation. HTA reports may include additional parts, or so-called domains, that assess Ethical, Social, Legal, or Organizational aspects (ESLO) (10). The variability of HTA content and format might benefit local decision-making but limits the ease of HTA transfers.

When considering HTA transfer, it is important to determine whether the underlying evidence is relevant (applicable), generalizable, or transferable. There is currently no consensus on how to define these terms (Table 1) (Reference Burford, Lewin, Welch, Rehfuess and Waters11). Berg et al. (Reference Berg, Cheung and Hiligsmann12) proposed two steps to guide the transferability of HTAs. First, an applicability, or relevance, assessment investigates whether “the health technologies considered in the HTA can be replicated and evaluated in the new context.” This is followed by a generalizability assessment, a mostly scientific effort, that considers whether “the effect size found in the HTA will be retained when implemented in the new context” (Reference Berg, Cheung and Hiligsmann12). If an HTA is applicable and its evidence holds in its current form, it is considered “generalizable.” More often, however, the interventions included in HTAs are applicable and transferable to the setting, but the evidence is not generalizable. Some HTA domains are inherently more generalizable or transferable than others. Findings from clinical trials used to assess efficacy and safety are often generalizable if target populations and conditions are similar, whereas costs and resource use in economic evaluations, as well as the ESLO domains, are often neither generalizable nor easily transferable, adding to the complexity of transferring HTAs (Reference Barbieri, Drummond and Rutten13).
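To make this distinction concrete, the two-step logic described by Berg et al. can be read as a simple decision rule. The sketch below is our own illustrative encoding in Python; the function name and category labels are not taken from the source.

```python
def classify_evidence(applicable: bool, effect_retained: bool, adaptable: bool) -> str:
    """Illustrative reading of the two-step assessment: applicability (relevance) first,
    then generalizability; adaptation is the fallback when the effect size is not
    expected to hold unchanged in the new context."""
    if not applicable:
        return "not relevant: consider a de novo HTA"
    if effect_retained:
        return "generalizable: the evidence can be adopted as-is"
    if adaptable:
        return "transferable: adapt parameters (e.g., costs, resource use) to the local setting"
    return "not transferable: local evidence is needed"

# Example: an applicable HTA whose economic results depend on local unit costs
print(classify_evidence(applicable=True, effect_retained=False, adaptable=True))
```

In this reading, adoption, adaptation, and rejection correspond to the generalizability, transferability, and relevance judgments discussed above.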

Table 1. Definitions of transferability and generalizability

Note: The EUnetHTA Glossary provides various definitions for transferability and generalizability, because there is no consensus among the various country partners – here, we present the definition used in the EUnetHTA Adaptation Toolkit.

Despite issues related to transferring evidence to a new context, most HTA agencies already reuse existing evidence to respond to local needs. The European Network for HTA (EUnetHTA) has explored ways of standardizing methods, reporting, and HTA transferability, resulting in products like the HTA Core Model (10), a methodological framework to collaboratively produce and share HTAs, which can indicate the level of transferability for each assessment element, and the EUnetHTA Adaptation Toolkit, which can aid HTA agencies to transfer HTAs produced elsewhere (Reference Guegan, Milne and Pordage14). The objective of this review is to provide an overview of HTA methods, approaches, or tools like the EUnetHTA toolkit for transferring HTAs. We also investigated other issues that might aid or hinder HTA transfers, specifically in low- and middle-income countries (LMIC).

Methods

We followed guidelines for scoping reviews by Arksey and O’Malley (Reference Arksey and O’Malley15) and Peters et al. (Reference Peters, Godfrey and Khalil16). We aimed to answer three research questions: (i) What methods, including approaches, checklists, or tools, are available for the adaptation and adoption of HTAs?; (ii) What factors of these methods aid or hinder the process of integrating an existing HTA in a new context?; (iii) Are there any methodological gaps associated with these methods, specifically for LMICs? The protocol is available on the Norwegian Institute of Public Health (NIPH) website (Reference Heupink, Chola, Peacocke, Frønsdal and Sæterdal17). We opted for an iterative process to allow for flexibility when reporting our findings. We made minor changes to the data charting (Supplementary Material S1) and carried out data synthesis and reporting related to the main themes identified in the included articles.

Eligibility criteria

We were primarily interested in three types of studies: studies that transferred an HTA from one setting to another, along with a description of the experiences arising; studies that described a tool, approach, or method for transferring an existing HTA, including an explanation of how the tool could be used in practice; and systematic reviews of various tools for HTA transfers. No restrictions were applied to the type of HTA report, and the HTAs could include multiple domains or only a specific domain such as clinical effectiveness, safety, or economic evaluation. We excluded studies providing methods that mainly reuse HTA recommendations rather than the underlying assessments. For instance, Multiple-Criteria Decision Analysis (MCDA) and deliberative processes for priority setting are tools primarily developed for decision-makers (HTA users) and were considered outside the “HTA analysis” step; mini-HTAs, hospital HTAs, and HTA fast-tracking were excluded for the same reason. Lastly, studies needed to be accessible in full text, related to public health, and published in English after 2005 to be eligible for inclusion.

Search strategy

To identify relevant literature, an information specialist helped develop the search strategy and performed literature searches in November 2020 in six electronic databases: Ovid Medline, Embase, Cochrane Database of Systematic Reviews, Scopus, Epistemonikos, and the Cochrane Methods Methodology Register (Supplementary Material S2). We did not search reference lists or grey literature but included conference abstracts from the past three years. Additionally, we sent a short questionnaire to members of the International Network of Agencies for Health Technology Assessment (INAHTA) asking whether they knew of or used any transfer tools or had any transfer experiences, in order to identify tools in development or not publicly available (Supplementary Material S3).

Study selection, data extraction, and charting

Articles identified by the search strategy were uploaded to Covidence for title and abstract screening performed by four reviewers (LF, EP, IS, and LC), where at least two reviewers independently screened each reference. EP and LF read and screened the full texts, while IS and LC verified their decisions. Disagreements were resolved through discussion or by consulting a third independent reviewer. EP and LF used Microsoft Excel (2016) to document, extract, and chart data. Items on the standardized data extraction sheet were guided by the research questions (Supplementary Material S1). We extracted or summarized data on all relevant items from each article. Afterwards, articles were grouped per HTA domain(s) and the data were narratively summarized. Strengths and weaknesses were identified, thematically grouped, and described, with specific attention to LMICs.

Results

The search strategy identified 1,323 unique references. Following screening, we read seventy-two full texts, of which nineteen articles were eligible for inclusion (Figure 1). We included three additional articles because the search strategy identified only one of the three articles from an article series (Reference Thielen, Van Mastrigt and Burgers18–Reference Wijnen, Van Mastrigt and Redekop20), and one article on the TRANSFER approach was published after the search (Reference Munthe-Kaas, Nøkleby, Lewin and Glenton21). The TRANSFER approach was developed following a scoping review (Reference Munthe-Kaas, Nokleby and Nguyen22) identified from the search strategy. Excluded studies and the reasons for exclusion are listed in Supplementary Material S4.

Figure 1. PRISMA flowchart of study selection process. The number of records identified per database are Ovid Medline (n = 464), Embase (n = 519), Cochrane Database of Systematic Reviews (n = 58), Scopus (n = 685), Epistemonikos (n = 304), and Cochrane Methods Methodology Register (n = 101).


The eligible twenty-two articles referred to HTA transferability in their approach, method, or proposed tool (Supplementary Material S5). Eight articles described their experience of conducting an HTA transfer using a tool (Reference Gorry, McCullagh and Barry23–Reference Alshreef, MacQuilkan and Dawkins30), twelve articles described transferability tools (Reference Turner, Chase and Milne4;Reference Thielen, Van Mastrigt and Burgers18–Reference Munthe-Kaas, Nøkleby, Lewin and Glenton21;Reference Chase, Rosten, Turner, Hicks and Milne31–Reference Brozek, Canelo-Aybar and Akl37), and two articles provided an overview of tools aiding transferability (Reference Munthe-Kaas, Nokleby and Nguyen22;Reference Goeree, He and O’Reilly38). No additional tools were identified by the INAHTA questionnaire (seven respondents) or among the conference abstracts (Supplementary Material S3).

The included articles were heterogeneous in their scope, outcomes, and approaches to HTA transferability. Firstly, we focused on the structure of the HTA transfer process (Figure 2). Then, we summarized three transferability approaches that guided the whole HTA process, namely: the EUnetHTA Adaptation Toolkit (Reference Guegan, Milne and Pordage14) along with the derived Model for Assessment of Telemedicine (MAST) (Reference Kidholm, Ekeland and Jensen34); the TRANSFER approach for systematic reviews (Reference Munthe-Kaas, Nøkleby, Lewin and Glenton21); and systematic reviews of economic evaluations (Reference Thielen, Van Mastrigt and Burgers18–Reference Wijnen, Van Mastrigt and Redekop20). Next, we listed relevant “catalyst” tools that may aid transferability assessments in economic evaluations (Table 2). We then discussed key issues that affect HTA transfers.

Figure 2. “Common” structure of HTA transfers.

Table 2. Tools to assess or guide transferability of economic evaluations

Note: We excluded the SBU checklists (47) and checklists not specific to transferability (e.g., the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) checklist (Reference Husereau, Drummond and Augustovski49) and model quality checklists such as those by Caro et al. (Reference Caro, Eddy and Kan42) and Philips et al. (Reference Philips, Bojke, Sculpher, Claxton and Golder41)).

The “common” structure of HTA transfers

There is no consensus on how to structure HTA transfers; however, our findings indicated overlap and common features between HTA transfers and processes for de novo HTA production. Figure 2 illustrates the overall structure of an HTA transfer. To summarize, after having decided to opt for an HTA transfer (step 1), a relevant HTA, or the research underlying a specific HTA domain, needs to be identified. This process commonly uses predefined selection criteria and follows procedural recommendations such as the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement (Reference Page, McKenzie and Bossuyt39). The checklist by Grutters et al. (Reference Grutters, Seferina and Tjan-Heijnen40) can help to determine and frame the relevance, or applicability, of an HTA to the identified research question (step 2). The identified HTA(s) is then assessed for methodological quality, risk of bias, and transferability (step 3). Step 3 is crucial in the transfer process, as transferring a low-quality product might lead to ambiguous, misleading, or even wrong conclusions. At this step, approaches start to diverge, for instance in the order of the quality, risk of bias, and transferability assessments. Some approaches even fuse steps 2 and 3, combining applicability and partial quality assessments into one. What most tools have in common, however, is that they prepare for transfers like a catalyst in a chemical reaction. An additional step may be added to identify transferability factors, that is, factors that need to be adapted or replaced with local data. After identifying transferability factors and, where necessary, adapting or replacing them (sub-step 3b), relevant data can be extracted (step 4). Then, the results are reported (step 5), discussed, and interpreted (step 6). Finally, the appraisal (step 7) and implementation in the local setting (step 8) conclude the HTA transfer process.
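As a minimal sketch of how this sequence might be operationalized, the snippet below encodes the applicability and quality gates and the optional adaptation sub-step; the field names, step labels, and function are our own illustration and are not drawn from any published tool.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CandidateHTA:
    """A candidate HTA identified in step 2; illustrative fields only."""
    applicable: bool            # step 2: relevance to the local decision problem
    acceptable_quality: bool    # step 3: quality / risk-of-bias gate
    transfer_factors: List[str] = field(default_factory=list)  # e.g., ["unit costs", "discount rate"]

def next_action(hta: CandidateHTA) -> str:
    """Walk the gates of the common transfer structure and state what comes next."""
    if not hta.applicable:
        return "Reject: not applicable; search for another HTA or produce one de novo"
    if not hta.acceptable_quality:
        return "Reject: transferring a low-quality product risks misleading conclusions"
    if hta.transfer_factors:
        return ("Adapt (step 3b): replace " + ", ".join(hta.transfer_factors) +
                " with local data, then extract, report, interpret, appraise, implement (steps 4-8)")
    return "Adopt: extract, report, interpret, appraise, implement (steps 4-8)"

# Example: an applicable, good-quality HTA whose economic inputs must be localized
print(next_action(CandidateHTA(True, True, ["unit costs", "discount rate"])))
```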

EUnetHTA adaptation toolkit

The EUnetHTA Adaptation Toolkit consists of two sections (Reference Turner, Chase and Milne4;Reference Guegan, Milne and Pordage14;Reference Chase, Rosten, Turner, Hicks and Milne31–Reference Kristensen, Lampe and Chase33). First, eight “speedy sifting” questions (on topic, language, description of the technology, methods, scope, peer review, conflict of interest, and timeliness) appraise the applicability of the identified HTA. If the decision is made to continue the adaptation process, the second section provides questions on relevance, reliability, and transferability for five “transferable” HTA domains (use of technology, safety, effectiveness, economic evaluation, and organization) (Reference Guegan, Milne and Pordage14). The MAST Application is derived from the Adaptation Toolkit, but with guidance questions specific to HTAs of telemedicine applications (Reference Kidholm, Ekeland and Jensen34).

Macpherson and Thompson (Reference Macpherson and Thompson24) describe their experiences using the toolkit to adapt two relative effectiveness assessments produced by EUnetHTA to inform decision-making in the National Health Service in Scotland. The modular structure of the toolkit makes it possible to use the whole or only parts of it (Reference Guegan, Milne and Pordage14); Gorry, McCullagh, and Barry (Reference Gorry, McCullagh and Barry23), for example, used questions from the economic evaluation domain to assess the generic and specific transferability of published economic evaluations on treatments for advanced melanoma to the Irish setting.

TRANSFER approach guides interpretation of transferability in systematic reviews

The TRANSFER approach mirrors the standard systematic review process in seven stages and provides guidance to assess and integrate the transferability of review findings. For instance, templates for collaborating with decision-makers and identifying context-specific “transfer” factors are included (Reference Munthe-Kaas, Nøkleby, Lewin and Glenton21).

Systematic reviews of economic evaluations

A systematic review of economic evaluations collects economic evidence about specific health interventions to inform evidence-based decisions. The article series by van Mastrigt et al. (Reference Thielen, Van Mastrigt and Burgers18–Reference Wijnen, Van Mastrigt and Redekop20) describes the process, with expert best-practice recommendations for each step, as follows: (i) initiate the review; (ii) identify and select economic evaluations; (iii) extract data and assess risk of bias and transferability; (iv) report results; (v) discuss and interpret results. Economic evaluations must “pass” the quality and risk of bias assessment before an assessment of transferability is relevant. Wijnen et al. (Reference Wijnen, Van Mastrigt and Redekop20) provide a comprehensive overview of checklists that can facilitate these quality assessments. Notably, they recommend the Philips checklist (Reference Philips, Bojke, Sculpher, Claxton and Golder41) for assessing the risk of bias in model-based economic evaluations. However, the Philips checklist is rather long, so when more than ten economic evaluations are under review, the checklist by Caro et al. (Reference Caro, Eddy and Kan42) is recommended instead. For trial-based economic evaluations, Wijnen et al. (Reference Wijnen, Van Mastrigt and Redekop20) recommended the BMJ (Reference Drummond and Jefferson43) or CHEC-extended (Reference Evers, Goossens, De Vet, Van Tulder and Ament44) checklist. Further, identified economic evaluations may be compared against local guidelines to check that they fit methodological standards and remain relevant. The International Society for Pharmacoeconomics and Outcomes Research (ISPOR) has collected thirty-three country-specific guidelines that may be used for this purpose (Reference Eldessouki and Smith45) [access at: https://tools.ispor.org/peguidelines/]. Lastly, Wijnen et al. (Reference Wijnen, Van Mastrigt and Redekop20) highlighted the Welte checklist (Reference Welte, Feenstra, Jager and Leidl46) as a convenient transferability checklist.
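Read as a decision rule, the checklist recommendations above can be summarized in a few lines. The following sketch is our own illustrative encoding of the guidance as reported by Wijnen et al., not a published algorithm; the threshold of ten evaluations is taken from the text, while the function itself is hypothetical.

```python
def recommend_checklist(model_based: bool, n_evaluations: int) -> str:
    """Illustrative summary of the checklist recommendations reported by Wijnen et al.:
    Philips for model-based evaluations, Caro et al. when more than ten are reviewed,
    BMJ or CHEC-extended for trial-based evaluations."""
    if model_based:
        return "Caro et al. questionnaire" if n_evaluations > 10 else "Philips checklist"
    return "BMJ or CHEC-extended checklist"

# Examples
print(recommend_checklist(model_based=True, n_evaluations=4))    # Philips checklist
print(recommend_checklist(model_based=True, n_evaluations=15))   # Caro et al. questionnaire
print(recommend_checklist(model_based=False, n_evaluations=3))   # BMJ or CHEC-extended checklist
```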

The search strategy identified three papers that conducted systematic searches to find and transfer economic evaluations. Checklists were primarily used to assess whether results could be adopted (i.e., transferred directly without any changes) in the local setting. Only the article by Nystrand et al. (Reference Nystrand, Gebreslassie, Ssegonja, Feldman and Sampaio25) used the “van Mastrigt” guidance (Reference Thielen, Van Mastrigt and Burgers18–Reference Wijnen, Van Mastrigt and Redekop20). They used multiple tools for the transferability assessments, including a country-specific checklist from the Swedish HTA agency (SBU) that covered transferability items other checklists did not consider (e.g., the feasibility of implementation, standard practice in the comparator, and the health financing mechanism) (47). Van Haalen, Severens, Tran-Duy, and Boonen used a combination of checklists and approaches, including a disease-specific reference case, the Welte checklist (Reference Welte, Feenstra, Jager and Leidl46), the Drummond-ISPOR approach (Reference Drummond, Barbieri and Cook48), and the Philips checklist (Reference Philips, Bojke, Sculpher, Claxton and Golder41). Ruggeri et al. (Reference Ruggeri, Drago, Rosiello, Orlando and Santori27) used pre-defined criteria and the Augustovski checklist (Reference Augustovski, Iglesias and Manca5) to assess generalizability. All studies found that the economic evaluations could be transferred, although adaptations were required, for instance replacing unit costs.

Catalyst transfer tools for economic evaluations

Another way to transfer economic evaluations is exemplified by more ad hoc approaches that identified, or had access to, existing economic evaluations and adapted the underlying models to fit the new setting (Reference Ademi, Tomonaga and van Stiphout28–Reference Alshreef, MacQuilkan and Dawkins30). These approaches followed the steps in Figure 2, using a variety of tools to assess applicability, quality, bias, or transferability. Ademi et al. (Reference Ademi, Tomonaga and van Stiphout28) used multiple tools and adapted the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) checklist (Reference Husereau, Drummond and Augustovski49) to predefine exclusion criteria for the selection process (step 2), while Essers et al. (Reference Essers, Seferina and Tjan-Heijnen29) used the knock-out criteria in the Welte checklist (Reference Welte, Feenstra, Jager and Leidl46) as a quick-sifting tool. If economic evaluations failed to report items from the CHEERS (Reference Husereau, Drummond and Augustovski49) or Welte checklist (Reference Welte, Feenstra, Jager and Leidl46), they were deemed non-transferable and excluded. After relevant economic evaluations were identified and selected, Ademi et al. (Reference Ademi, Tomonaga and van Stiphout28) used quality assessments to judge the methods’ characteristics and other aspects, whereas neither Alshreef et al. (Reference Alshreef, MacQuilkan and Dawkins30) nor Essers et al. (Reference Essers, Seferina and Tjan-Heijnen29) included quality assessments; both used models previously applied in another context and considered this prior validation appropriate. However, Essers et al. (Reference Essers, Seferina and Tjan-Heijnen29) did use the Urdahl (Reference Urdahl, Manca and Sculpher50) and EURONHEED (Reference Boulenger, Nixon and Drummond35) checklists, and Alshreef et al. (Reference Alshreef, MacQuilkan and Dawkins30) used the Mullins checklist (Reference Mullins, Onwudiwe and Branco de Araújo51) to identify factors that could limit the transfer (step 3). Identified factors included costing parameters, resource use, discount rates, and outcome measures (i.e., mortality, utility, and health-related quality of life). All papers adjusted factors that could limit transferability by replacing them with local data where available; otherwise, expert assumptions or international studies were used. Table 2 provides an overview of the identified checklists that can aid transfers of economic evaluations.
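To illustrate what adapting an economic model typically involves, the sketch below replaces source-setting unit costs and the discount rate with local values and recomputes the incremental cost-effectiveness ratio, keeping the clinical effect unchanged. All numbers, parameter names, and the three-year horizon are hypothetical and are not taken from any of the included studies.

```python
def discounted_total(annual_values, rate):
    """Sum a stream of annual values discounted at the given rate (year 0 undiscounted)."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(annual_values))

def icer(costs_new, costs_old, qalys_new, qalys_old, rate):
    """Incremental cost-effectiveness ratio with discounted costs and QALYs."""
    d_cost = discounted_total(costs_new, rate) - discounted_total(costs_old, rate)
    d_qaly = discounted_total(qalys_new, rate) - discounted_total(qalys_old, rate)
    return d_cost / d_qaly

# Hypothetical source-setting inputs (annual costs per patient and QALYs over 3 years)
source = dict(costs_new=[12_000, 3_000, 3_000], costs_old=[8_000, 2_000, 2_000],
              qalys_new=[0.80, 0.78, 0.76], qalys_old=[0.75, 0.72, 0.70], rate=0.035)

# Adaptation: swap in (hypothetical) local unit costs and discount rate,
# while keeping the clinical effect (QALYs) from the source model unchanged.
local = dict(source, costs_new=[9_000, 2_200, 2_200], costs_old=[6_500, 1_500, 1_500], rate=0.05)

print(f"Source-setting ICER: {icer(**source):,.0f} per QALY gained")
print(f"Locally adapted ICER: {icer(**local):,.0f} per QALY gained")
```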

HTA transferability issues to consider

The articles included in this mapping all reported factors and issues that may affect the transferability of HTAs. For instance, it is essential to identify and have access to relevant HTAs, research papers, or models in order to avoid duplication (Reference Wijnen, Van Mastrigt and Redekop20;Reference Gorry, McCullagh and Barry23;Reference Nystrand, Gebreslassie, Ssegonja, Feldman and Sampaio25;Reference Ademi, Tomonaga and van Stiphout28;Reference Essers, Seferina and Tjan-Heijnen29;Reference Goeree, He and O’Reilly38). Differences in the context of the technology under review (i.e., available resources, the standard of care, the type of health system, and the population) further hinder the transferability of HTAs (Reference Alshreef, MacQuilkan and Dawkins30;Reference Kidholm, Ekeland and Jensen34;Reference Brozek, Canelo-Aybar and Akl37). Replacing key parameters with local data can help to ensure relevant estimates, but country-specific data are often lacking in resource-limited settings. Sensitivity analyses and other estimation techniques can help to fill data gaps, but they require knowledge and skills (Reference Kidholm, Ekeland and Jensen34;Reference Boulenger, Nixon and Drummond35;Reference Goeree, He and O’Reilly38). Additionally, knowledge of HTA, and specifically appraisal skills, is essential: HTAs that use poor-quality data will be of limited use for a transfer, as this might perpetuate poor HTA recommendations (Reference Turner, Chase and Milne4;Reference Chase, Rosten, Turner, Hicks and Milne31;Reference Turner, Chase and Milne32). Lastly, differences in HTA practices also affect the ease of transferability. Hence, the challenges of HTA transfers should not be underestimated, as they still require time, scientific expertise, and other resources.

Table 3 presents an overview of all the identified transferability issues, which we summarized in five overarching themes: (i) lack of access to original studies or models; (ii) poor quality of HTAs, underlying studies, and models; (iii) lack of local data limits the possibility to adapt existing HTAs; (iv) local HTA requirements ensure relevancy but diversify end-products; and (v) significant resources are required to conduct HTA transfers.

Table 3. Factors that may influence the transferability of HTAs

HTA, health technology assessment; PRISMA, preferred reporting items for systematic reviews and meta-analyses.

Discussion

This scoping review identified the steps commonly followed when transferring HTAs to a new context (Figure 2). After initiating (step 1) and searching for and selecting (step 2) an applicable HTA, one or more tools are used to assess transferability (step 3). These tools, sometimes adapted from their original purpose (e.g., CHEERS, which is commonly used to ensure comprehensive reporting (Reference Husereau, Drummond and Augustovski49)), are then used to assess the quality and transferability of the HTA. Methods are available to identify, adapt, or replace transferability factors when evidence is not generalizable or transferability can be improved (step 3b). Guidance for the steps following these assessments, for instance on how to interpret the added uncertainties of a transfer, is limited (steps 4–6). Nor is there clear consensus on when to use which tools to assess transferability (Reference Nystrand, Gebreslassie, Ssegonja, Feldman and Sampaio25;Reference Goeree, He and O’Reilly38;Reference Korber52). Most of the identified tools were checklists relevant to economic evaluations, while only a limited number were suitable for systematic reviews assessing the clinical effectiveness and safety of health interventions. The EUnetHTA Adaptation Toolkit and the derived MAST application were the only tools that included checklists for multiple HTA domains. However, no guidance was provided for the ESLO domains, as these were deemed less suited to transfer given that they are highly contextual and require local information (Reference Turner, Chase and Milne4;Reference Chase, Rosten, Turner, Hicks and Milne31–Reference Kristensen, Lampe and Chase33).

We identified various issues hindering the transferability of HTAs. Most importantly, HTAs inform decisions locally and, for diverse reasons, existing HTAs may not fit the new setting’s needs, policy questions, or methodological standards. Many of the approaches identified in this review try to overcome these issues. For instance, Grutters et al. (Reference Grutters, Seferina and Tjan-Heijnen40) provided a checklist that can be used to assess the applicability of HTAs to the decision problem. Secondly, the TRANSFER approach highlighted the benefit of engaging stakeholders early, as they can play an important role in refining research questions and identifying the factors that affect transferability. Additionally, the TRANSFER approach allows a sub-analysis specific to these transferability factors to be run in order to obtain estimates relevant to the local setting (Reference Munthe-Kaas, Nøkleby, Lewin and Glenton21). Thirdly, country guidelines or reference cases (e.g., the SBU checklist (47)) could help to match existing HTAs to local requirements and needs. Likewise, disease-specific reference cases may aid transfers (Reference Nystrand, Gebreslassie, Ssegonja, Feldman and Sampaio25). Fourthly, local data can replace parameters in transferred models or HTAs; for instance, ISPOR recommends replacing at least prices in economic evaluations (Reference Drummond, Barbieri and Cook48). Both quantitative and qualitative methods have been used to identify the transferability factors that significantly affect outcomes (Reference Perrier, Buja and Mastrangelo53;Reference Goeree, Burke and O’Reilly54). Goeree et al. (Reference Goeree, Burke and O’Reilly54) identified over seventy-seven transferability factors, showing that adapting parameters in economic evaluations is still a complex task. Lastly, even with existing databases, such as the INAHTA HTA database, the Cochrane Library, the Tufts database of economic evaluations, and many more, access to relevant evidence remains an issue in HTA transfers.

We identified no transfers between LMICs, and only one paper described a transfer to an LMIC (Reference Alshreef, MacQuilkan and Dawkins30). However, limited capacities and the potentially high up-front costs of conducting and implementing HTAs might be disincentives to institutionalizing and producing HTAs and HTA transfers (Reference Nemeth, Goettsch and Kristensen55). None of the tools identified in this review provided pragmatic guidance on how to overcome these challenges. Collaborations such as that illustrated by Alshreef et al. (Reference Alshreef, MacQuilkan and Dawkins30) might help to bridge the gap and strengthen capacities in countries with limited resources for HTAs.

Implications for practice

Our findings indicated that the EUnetHTA Adaptation Toolkit was the most comprehensive and versatile tool, including separate checklists for five domains (use of technology, safety, effectiveness, economic evaluation, and organization). However, other tools might fit better depending on the type of evidence to be transferred. For instance, Alshreef et al. (Reference Alshreef, MacQuilkan and Dawkins30) omitted the EUnetHTA Adaptation Toolkit (Reference Guegan, Milne and Pordage14) because it was not specific to model adaptation; the Mullins checklist (Reference Mullins, Onwudiwe and Branco de Araújo51) was selected instead because it best fitted their criteria, which included relevance to model adaptation, endorsement by a respected organization, compatibility with the International Decision Support Initiative’s reference case for economic evaluations, transparency, inclusiveness, length, and external validity. Hence, which tool to select for an HTA transfer in any given setting is not straightforward and requires careful consideration.

All HTA transfers face trade-offs between minimizing bias and maximizing precision when selecting evidence. These issues are often magnified in LMICs, which have limited access to appropriate local data and where research from high-income countries is usually not as easily applicable or generalizable. Therefore, low-resource settings must be pragmatic, as HTA transfers will never be perfect. Local or generalizable studies from other settings might not exist, whereas transferring and adapting applicable studies that answer the right question, even if the study population is small or the quality is low, can still provide valuable insights (Reference Hawkins, Heggie, Wu, Glassman, Giedion and Smith56). There is a need to strengthen capacity and to provide pragmatic guidance that discusses the trade-offs, benefits, and disadvantages of HTA transfers. Additionally, involving stakeholders is essential, as they can provide expert information, ensure relevant research questions are asked, inform the transferability assessment, and comment on the uncertainty of transferred HTAs (Reference Pinho-Gomes, Stone and Shaw7;Reference Trowman, Powers and Ollendorf8;Reference Munthe-Kaas, Nøkleby, Lewin and Glenton21).

HTA transfers are only one of the approaches which could benefit LMICs. There are various other pragmatic and rapid HTA approaches, also referred to as Adaptive HTA (aHTA), that reuse existing evidence to inform decisions, such as expedited processes, adaptation of global datasets, reuse of reviews, and price-benchmarking (Reference Nemzoff, Ruiz and Chalkidou2). These aHTA approaches require a diverse range of knowledge and skills to interpret the findings and understand uncertainty potentially added by the adaptation, or transfer, processes. There is a need for more research to understand how and when HTA transfers, as well as other aHTA approaches, are appropriate and helpful in LMICs, and what type of guidance is relevant for settings with fewer capacities, resources, and defined processes for HTA.

Strengths and limitations

The main strengths of this scoping review are the comprehensive search and the up-to-date overview of tools, checklists, and approaches that can be used when considering an HTA transfer. We also addressed key issues related to definitions and missing guidance which may interfere with HTA transfers, specifically in LMIC settings.

This scoping review also has some limitations. Firstly, we focused on tools relevant to the HTA analysis step rather than the whole HTA process. Consequently, tools relevant to other steps, such as MCDA, deliberative processes for priority setting, appraisal, and frameworks for topic selection or stakeholder engagement, which could also be part of the transferability toolbox, were outside the scope of this review and excluded. Secondly, to reduce the volume of literature, the search strategy focused on the clinical effectiveness and economic domains, and less explicitly on other HTA domains. Transferring existing evidence is complex, as it touches many components and disciplines. The heterogeneity of the included studies reflects this complexity and made it difficult to generalize findings. We are aware that other relevant tools (e.g., country guidelines for rapid HTA methods, appraisal checklists, etc.) exist. Nevertheless, by including conference abstracts and sending the questionnaire to INAHTA members, we are confident that we have captured the most used transferability tools for HTA analysis currently available.

Conclusion

There is increasing interest in rapid and pragmatic HTA methods that, without using a de novo process, can answer policy questions rapidly, reduce duplication of effort, and make better use of limited resources. This scoping review identified tools from the published literature that can aid the transferability of HTA products. The EUnetHTA Adaptation Toolkit is the most comprehensive, as it covers more HTA domains than the other tools. However, depending on the evidence, other checklists might be more appropriate. None of the identified tools covered all domains of an HTA, nor tackled all aspects of the transferability issues that we identified. Harmonization of HTA products and evidence-transfer processes, as well as pragmatic guidance for HTA transferability, especially for settings with limited HTA capacity, seem particularly warranted.

Supplementary material

To view supplementary material for this article, please visit https://doi.org/10.1017/S026646232200321X.

Acknowledgments

The authors would like to thank A.M. Nøstberg (Information Specialist) for supporting the development and execution of the search strategy. We also thank A.M.S. Skar for her continuous support and comments on various drafts of the article. Versions of this protocol were developed during the Systematic Review and Meta-Analysis course from the Norwegian Research School of Global Health (2020). The authors thank head-lecturer L. Larun and all other lecturers (including E.M.L. Denison, M. Johansen (Information Specialist), and J. Bidonde) for sharing their knowledge and support. All individuals are affiliated with the Norwegian Institute of Public Health.

Funding statement

This work was supported by the Norwegian Agency for Development and Cooperation (Norad). Grant number: QZA-18/0102.

Conflicts of interest

The authors declare none.

References

1. O’Rourke, B, Oortwijn, W, Schuller, T. The new definition of health technology assessment: A milestone in international collaboration. Int J Technol Assess Health Care. 2020;36:187–190.
2. Nemzoff, C, Ruiz, F, Chalkidou, K, et al. Adaptive health technology assessment to facilitate priority setting in low- and middle-income countries. BMJ Glob Health. 2021;6:e004549.
3. Bijlmakers, L, Mueller, D, Kahveci, R, Chen, Y, van der Wilt, GJ. Integrate-HTA: A low- and middle-income country perspective. Int J Technol Assess Health Care. 2017;33:599–604.
4. Turner, S, Chase, DL, Milne, R, et al. The adaptation of health technology assessment reports: Identification of the need for, and development of, a toolkit to aid the process. Int J Technol Assess Health Care. 2009;25:28–36.
5. Augustovski, F, Iglesias, C, Manca, A, et al. Barriers to generalizability of health economic evaluations in Latin America and the Caribbean region. Pharmacoeconomics. 2009;27:919–929.
6. Norwegian Institute of Public Health. Supporting implementation of HTA in low- and middle-income countries [Internet]. 2021 [cited 31 October 2021]. Available from: https://www.fhi.no/en/qk/global-healt-collaboration/evidence-to-decisions/partnering-low-and-middle-income-countries-to-support-local-implementation–/
7. Pinho-Gomes, A-C, Stone, J, Shaw, T, et al. Values, principles, strategies, and frameworks underlying patient and public involvement in health technology assessment and guideline development: A scoping review. Int J Technol Assess Health Care. 2022;38:e46.
8. Trowman, R, Powers, A, Ollendorf, DA. Considering and communicating uncertainty in health technology assessment. Int J Technol Assess Health Care. 2021;37:e74.
9. Merlin, T, Tamblyn, D, Ellery, B. What’s in a name? Developing definitions for common health technology assessment product types of the International Network of Agencies for Health Technology Assessment (INAHTA). Int J Technol Assess Health Care. 2014;30:430–437.
10. EUnetHTA Joint Action 2 WP. HTA Core Model® version 3.0 (pdf). 2016 [cited 31 October 2021]. Available from: www.htacoremodel.info/BrowseModel.aspx
11. Burford, B, Lewin, S, Welch, V, Rehfuess, E, Waters, E. Assessing the applicability of findings in systematic reviews of complex interventions can enhance the utility of reviews for decision making. J Clin Epidemiol. 2013;66:1251–1261.
12. Berg, ML, Cheung, KL, Hiligsmann, M, et al. Model-based economic evaluations in smoking cessation and their transferability to new contexts: A systematic review. Addiction. 2017;112:946–967.
13. Barbieri, M, Drummond, M, Rutten, F, et al. What do international pharmacoeconomic guidelines say about economic data transferability? Value Health. 2010;13:1028–1037.
14. Guegan, E, Milne, R, Pordage, A. EUnetHTA HTA adaptation toolkit & glossary: Revised version 5. UK: EUnetHTA; 2011.
15. Arksey, H, O’Malley, L. Scoping studies: Towards a methodological framework. Int J Soc Res Methodol. 2005;8:19–32.
16. Peters, MDJ, Godfrey, CM, Khalil, H, et al. Guidance for conducting systematic scoping reviews. Int J Evid Based Healthc. 2015;13:141–146.
17. Heupink, L, Chola, L, Peacocke, E, Frønsdal, K, Sæterdal, I. Mapping of methods used for the adoption and adaptation of health technology assessments (HTA): A protocol for a scoping review. Oslo: Norwegian Institute of Public Health; 2021.
18. Thielen, F, Van Mastrigt, G, Burgers, L, et al. How to prepare a systematic review of economic evaluations for clinical practice guidelines: Database selection and search strategy development (part 2/3). Expert Rev Pharmacoecon Outcomes Res. 2016;16:705–721.
19. van Mastrigt, GA, Hiligsmann, M, Arts, JJ, et al. How to prepare a systematic review of economic evaluations for informing evidence-based healthcare decisions: A five-step approach (part 1/3). Expert Rev Pharmacoecon Outcomes Res. 2016;16:116.
20. Wijnen, B, Van Mastrigt, G, Redekop, W, et al. How to prepare a systematic review of economic evaluations for informing evidence-based healthcare decisions: Data extraction, risk of bias, and transferability (part 3/3). Expert Rev Pharmacoecon Outcomes Res. 2016;16:723–732.
21. Munthe-Kaas, H, Nøkleby, H, Lewin, S, Glenton, C. The TRANSFER approach for assessing the transferability of systematic review findings. BMC Med Res Methodol. 2020;20:122.
22. Munthe-Kaas, H, Nokleby, H, Nguyen, L. Systematic mapping of checklists for assessing transferability. Syst Rev. 2019;8:22.
23. Gorry, C, McCullagh, L, Barry, M. Transferability of economic evaluations of treatments for advanced melanoma. Pharmacoeconomics. 2020;38:217–231.
24. Macpherson, K, Thompson, L. Experiences in adapting European network for health technology assessment rapid reviews to inform local decision making. Int J Technol Assess Health Care. 2017;33(2):155–159.
25. Nystrand, C, Gebreslassie, M, Ssegonja, R, Feldman, I, Sampaio, F. A systematic review of economic evaluations of public health interventions targeting alcohol, tobacco, illicit drug use and problematic gambling: Using a case study to assess transferability. Health Policy. 2020;125:54–74.
26. van Haalen, HG, Severens, JL, Tran-Duy, A, Boonen, A. How to select the right cost-effectiveness model? A systematic review and stepwise approach for selecting a transferable health economic evaluation model for rheumatoid arthritis. Pharmacoeconomics. 2014;32:429–442.
27. Ruggeri, M, Drago, C, Rosiello, F, Orlando, V, Santori, C. Economic evaluation of treatments for migraine: An assessment of the generalizability following a systematic review. Pharmacoeconomics. 2020;38:473–484.
28. Ademi, Z, Tomonaga, Y, van Stiphout, J, et al. Adaptation of cost-effectiveness analyses to a single country: The case of bariatric surgery for obesity and overweight. Swiss Med Wkly. 2018;148:w14626.
29. Essers, BA, Seferina, SC, Tjan-Heijnen, VC, et al. Transferability of model-based economic evaluations: The case of trastuzumab for the adjuvant treatment of HER2-positive early breast cancer in the Netherlands. Value Health. 2010;13:375–380.
30. Alshreef, A, MacQuilkan, K, Dawkins, B, et al. Cost-effectiveness of docetaxel and paclitaxel for adjuvant treatment of early breast cancer: Adaptation of a model-based economic evaluation from the United Kingdom to South Africa. Value Health Reg Issues. 2019;19:65–74.
31. Chase, D, Rosten, C, Turner, S, Hicks, N, Milne, R. Development of a toolkit and glossary to aid in the adaptation of health technology assessment (HTA) reports for use in different contexts. Health Technol Assess. 2009;13:1–142, iii.
32. Turner, S, Chase, DL, Milne, R, et al. The health technology assessment adaptation toolkit: Description and use. Int J Technol Assess Health Care. 2009;25:37–41.
33. Kristensen, FB, Lampe, K, Chase, DL, et al. Practical tools and methods for health technology assessment in Europe: Structures, methodologies, and tools developed by the European network for health technology assessment, EUnetHTA. Int J Technol Assess Health Care. 2009;25:1–8.
34. Kidholm, K, Ekeland, AG, Jensen, LK, et al. A model for assessment of telemedicine applications: MAST. Int J Technol Assess Health Care. 2012;28:44–51.
35. Boulenger, S, Nixon, J, Drummond, M, et al. Can economic evaluations be made more transferable? Eur J Health Econ. 2005;6:334–346.
36. Nixon, J, Rice, S, Drummond, M, et al. Guidelines for completing the EURONHEED transferability information checklists. Eur J Health Econ. 2009;10:157–165.
37. Brozek, JL, Canelo-Aybar, C, Akl, EA, et al. GRADE guidelines 30: The GRADE approach to assessing the certainty of modeled evidence – an overview in the context of health decision-making. J Clin Epidemiol. 2021;129:138–150.
38. Goeree, R, He, J, O’Reilly, D, et al. Transferability of health technology assessments and economic evaluations: A systematic review of approaches for assessment and application. Clinicoecon Outcomes Res. 2011;3:89–104.
39. Page, MJ, McKenzie, JE, Bossuyt, PM, et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ. 2021;10:89.
40. Grutters, JPC, Seferina, SC, Tjan-Heijnen, VCG, et al. Bridging trial and decision: A checklist to frame health technology assessments for resource allocation decisions. Value Health. 2011;14:777–784.
41. Philips, Z, Bojke, L, Sculpher, M, Claxton, K, Golder, S. Good practice guidelines for decision-analytic modelling in health technology assessment. Pharmacoeconomics. 2006;24:355–371.
42. Caro, JJ, Eddy, DM, Kan, H, et al. Questionnaire to assess relevance and credibility of modeling studies for informing health care decision making: An ISPOR-AMCP-NPC Good Practice Task Force report. Value Health. 2014;17:174–182.
43. Drummond, MF, Jefferson, T. Guidelines for authors and peer reviewers of economic submissions to the BMJ. BMJ. 1996;313:275–283.
44. Evers, S, Goossens, M, De Vet, H, Van Tulder, M, Ament, A. Criteria list for assessment of methodological quality of economic evaluations: Consensus on health economic criteria. Int J Technol Assess Health Care. 2005;21:240–245.
45. Eldessouki, R, Smith, MD. Health care system information sharing: A step toward better health globally. Value Health Reg Issues. 2012;1:118–120.
46. Welte, R, Feenstra, T, Jager, H, Leidl, R. A decision chart for assessing and improving the transferability of economic evaluation results between countries. Pharmacoeconomics. 2004;22:857–876.
47. Swedish Agency for Health Technology Assessment and Assessment of Social Services. Appendix 7 - Checklist for assessing the quality of trial-based health economic studies; and Appendix 8 - Checklist for assessing the quality of health economic modelling studies. In: Assessment of methods in health care – A handbook. Stockholm, Sweden: Swedish Agency for Health Technology Assessment and Assessment of Social Services; 2018.
48. Drummond, M, Barbieri, M, Cook, J, et al. Transferability of economic evaluations across jurisdictions: ISPOR Good Research Practices Task Force report. Value Health. 2009;12:409–418.
49. Husereau, D, Drummond, M, Augustovski, F, et al. Consolidated Health Economic Evaluation Reporting Standards (CHEERS) 2022 explanation and elaboration: A report of the ISPOR CHEERS II Good Practices Task Force. Value Health. 2022;25:10–31.
50. Urdahl, H, Manca, A, Sculpher, MJ. Assessing generalisability in model-based economic evaluation studies. Pharmacoeconomics. 2006;24:1181–1197.
51. Mullins, DC, Onwudiwe, NC, Branco de Araújo, GT, et al. Guidance document: Global pharmacoeconomic model adaption strategies. Value Health Reg Issues. 2014;5:7–13.
52. Korber, K. Potential transferability of economic evaluations of programs encouraging physical activity in children and adolescents across different countries – A systematic review of the literature. Int J Environ Res Public Health. 2014;11:10606–10621.
53. Perrier, L, Buja, A, Mastrangelo, G, et al. Transferability of health cost evaluation across locations in oncology: Cluster and principal component analysis as an explorative tool. BMC Health Serv Res. 2014;14:537.
54. Goeree, R, Burke, N, O’Reilly, D, et al. Transferability of economic evaluations: Approaches and factors to consider when using results from one geographic area for another. Curr Med Res Opin. 2007;23:671–682.
55. Nemeth, B, Goettsch, W, Kristensen, FB, et al. The transferability of health technology assessment: The European perspective with focus on Central and Eastern European countries. Expert Rev Pharmacoecon Outcomes Res. 2020;20:321–330.
56. Hawkins, N, Heggie, R, Wu, O. Reliable sources? Generating, selecting, and applying evidence to inform the health benefits package. In: Glassman, A, Giedion, U, Smith, PC, editors. What’s in, what’s out: Designing benefits for universal health coverage. Washington, DC: Brookings Institution Press; 2017.