
Toolkits for dissemination and implementation research: Preliminary development

Published online by Cambridge University Press:  27 November 2018

Ana A. Baumann*
Affiliation:
Brown School of Social Work, Washington University in St. Louis, St. Louis, MO, USA
Alexandra B. Morshed
Affiliation:
Brown School of Social Work, Washington University in St. Louis, St. Louis, MO, USA
Rachel G. Tabak
Affiliation:
Prevention Research Center in St. Louis, Brown School, Washington University in St. Louis, St. Louis, MO, USA
Enola K. Proctor
Affiliation:
Brown School of Social Work, Washington University in St. Louis, St. Louis, MO, USA
*
*Address for correspondence: A. A. Baumann, PhD, Brown School of Social Work, Washington University in St. Louis, 600 S. Euclid, CB 8217, St. Louis, MO 63110, USA. (Email: [email protected])

Abstract

The Dissemination and Implementation Research Core, a research methods core from the Clinical and Translation Science Award at Washington University in St. Louis Institute of Clinical and Translational Sciences, developed toolkits about dissemination and implementation (D&I) concepts (e.g., D&I outcomes, strategies). This paper reports on the development of the toolkits. These toolkits respond to 3 identified needs for capacity building in D&I research: resources for investigators new to the D&I field, consolidation of tools, and limitations in local contexts.

Type
Education
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is unaltered and is properly cited. The written permission of Cambridge University Press must be obtained for commercial re-use or in order to create a derivative work.
Copyright
© The Association for Clinical and Translational Science 2018

Introduction

The field of dissemination and implementation (D&I), which aims to accelerate research translation, is growing and emerging as a national priority [1]. D&I is a multidisciplinary science, and as such it presents the advantages and challenges of researchers of different disciplines working together [2, 3]. The Clinical and Translational Science Award (CTSA) [4], through the National Center for Advancing Translational Science, provides an infrastructure, including cores, to support collaboration between investigators from multiple disciplines leading to innovative translational science [5]. A challenge for the cores is to provide timely, high-quality consultation to investigators from different disciplines.

Several resources for D&I research are scattered across the web or in scholarly products, such as measurement compilations [6, 7], guidance on frameworks [8], sources of information about grants and trainings [8, 9], or books [10]. The quantity and variability of resources is a testament to the importance of D&I science for different disciplines [11]. While these efforts are valuable, researchers new to the field may not know these resources exist or where to find them [12]. To address this educational gap, to provide tools for investigators from different disciplines interested in D&I, and to equip investigators with key D&I language and concepts, our team from the Dissemination and Implementation Research Core (DIRC), a research methods core from the CTSA at Washington University in St. Louis Institute of Clinical and Translational Sciences (ICTS), developed a set of toolkits about D&I concepts.

The ICTS supports our methods core to provide technical assistance to affiliated investigators preparing D&I research grants. We do so primarily through face-to-face meetings. As the effort supporting our core is relatively small and demand for DIRC services has steadily grown, we recognized the need for efficiency. Toolkits offered efficiency for the Core and an opportunity for users to review materials at their own pace and on their own schedule.

A “toolkit” can be defined as an action-oriented compilation of information, resources, or tools to guide users in organizing information about a specific topic [13]. Our team uses toolkits to supplement, not replace, individual consultation. Development of each toolkit was led by a research assistant from the DIRC team, under supervision of the DIRC Coordinator and Principal Investigator. This article describes the first phase of development of the toolkits, which aim to build knowledge of D&I research. The topics and focus of the 8 toolkits are shown in Table 1.

Table 1 Topics and goals of the Dissemination and Implementation Research Core dissemination and implementation (D&I) toolkits, available at https://sites.wustl.edu/wudandi/di-toolkits/

Materials and Methods

Participants

Participants in this study included members of our DIRC team, D&I researchers at Washington University in St. Louis, and national experts in the D&I field. The study was reviewed and exempted by the Washington University in St. Louis Human Research Protections Office (Institutional Review Board no. 201508175). The D&I experts who provided feedback on the toolkits received a $50 gift certificate from Amazon.

Theoretical Framework

Our team relied on educational design research methodology to develop the toolkits. The function of educational design research is to develop solutions to complex educational problems through “systematic study of designing, developing, and evaluating educational interventions” [14]. This systematic, iterative methodology assesses adaptations to the tools and interventions and documents the process in enough detail to allow replication by others [14–16]. We selected the educational design framework, rather than other models that could provide guidance to this process (e.g., user-centered design), because it allowed us to create tools to support our consultation process—which often involves an education component of defining D&I and how D&I can support our customers’ work—while also advancing the knowledge on tool development [14–16].

Fig. 1 illustrates this study’s evaluation process, adapted from McKenney’s [17] model and based on educational design research methodology [14–17]. The process is cyclical and consists of 3 phases. Each phase comprises small cycles of research, with reflection and documentation taking place throughout [13]. Each phase has a separate focus and emphasizes different evaluation criteria [13]. The first phase (preliminary research and initial development) consists of analysis of the educational problem and its context and leads to the development of toolkit prototypes and plans for their implementation. The second phase (development through small-scale evaluation) iteratively evaluates the toolkits with emphasis on content validity and consistency. Finally, in the summative evaluation (not shown in the figure), the toolkits are evaluated based on their use as intended. Because the D&I field is constantly evolving, we consider that Phases I and II can be in constant iteration, where after the initial development and preliminary research, the toolkits would be updated after a certain period of time has passed. This study describes the details from Phase I and Phase II of the development of the 8 D&I toolkits.

Fig. 1 Cyclical evaluation process of the Dissemination and Implementation Research Core dissemination and implementation toolkits (Adapted from McKenney [17]).

Toolkit Reach

We used Google Analytics to examine the number of visitors (new and returning) to the page that hosts the toolkits, and their location.
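The kind of summary described above—counts of new vs. returning visitors and a tally by location—can be sketched as follows. This is a hypothetical illustration that aggregates a generic session export; the record keys (`visitor_type`, `country`) are assumptions for the sketch, not Google Analytics field names.

```python
# Hypothetical sketch: summarizing visitor sessions exported from an
# analytics tool. Keys 'visitor_type' and 'country' are assumed names.
from collections import Counter

def summarize_visits(sessions):
    """Count new vs. returning visitors and tally visitor locations."""
    by_type = Counter(s["visitor_type"] for s in sessions)
    by_country = Counter(s["country"] for s in sessions)
    return by_type, by_country

# Illustrative records only; not the study's actual analytics data.
sessions = [
    {"visitor_type": "new", "country": "United States"},
    {"visitor_type": "returning", "country": "United States"},
    {"visitor_type": "new", "country": "Australia"},
]
by_type, by_country = summarize_visits(sessions)
print(by_type["new"], by_country["United States"])  # 2 2
```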

Results

Phase I: Preliminary Research and Initial Development

Preliminary Research

Table 2 provides an overview of the steps taken for the development of the DIRC toolkits. To complete the first phase—analysis of the educational problem and its context—we characterized the likely users, their research experience, current D&I expertise, training preferences, and organizational needs. Discussions among the DIRC leadership led to the decision that the first target audience for toolkits would be the DIRC customers. They range widely in substantive area (e.g., psychologists, social workers, nurses, cardiologists, pediatricians), career stage (assistant, associate, and full professors), and D&I expertise. We next reviewed existing online D&I materials and tools to identify gaps. Although we found several resources, most were scattered across the web and targeted to researchers with mid-level to high-level knowledge of D&I. Our team concluded that the field lacked a compilation of resources for researchers with beginner-level knowledge of D&I.

Table 2 Overview of the development of the DIRC toolkits

D&I, dissemination and implementation; DIRC, Dissemination and Implementation Research Core; ICTS, Institute of Clinical and Translational Sciences; WU, Washington University in St. Louis; WUNDIR, the Washington University Network of Dissemination and Implementation Researchers.

Phase II: Development of Toolkits

Phase II of the evaluation consisted of several evaluation cycles in which we assessed and revised the toolkits for content validity and consistency [13] and expected practicality and effectiveness. The toolkits underwent 3 main evaluations. To ensure initial content validity, we conducted a systematic review of scientific literature relevant to each toolkit. Next, to ensure initial consistency, we revised each toolkit using a checklist of toolkit characteristics and design specifications, adapted from AHRQ’s Toolkit Content and Usability Checklists [12]. As a result, we added a description of objectives and a short overview to the first page of each toolkit. Finally, each toolkit was evaluated by 2 national experts in the D&I field, selected from our network of collaborators. Experts were contacted by email and asked to review the content of the toolkit. Each toolkit was attached as a PDF, and survey responses were collected in Qualtrics. The survey included quantitative items on a Likert-type scale (e.g., “To what extent do you think this toolkit adequately captures the necessary elements to achieve its objectives?”) as well as open-ended questions (e.g., “Which components of the toolkit would benefit from further development in order to enhance the toolkit’s usefulness for D&I beginners?”).
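Quantitative ratings of this kind can be summarized per toolkit with a short script like the one below. The 10–100 agreement scale (10 = “completely disagree,” 100 = “completely agree”) follows the note accompanying Table 3; the toolkit names and rating values here are illustrative, not the study’s data.

```python
# Hypothetical sketch: averaging expert ratings per toolkit on the
# paper's 10-100 agreement scale. All values below are illustrative.
from statistics import mean

def summarize_ratings(ratings_by_toolkit):
    """Return the mean rating for each toolkit, rounded to 1 decimal."""
    return {name: round(mean(vals), 1)
            for name, vals in ratings_by_toolkit.items()}

ratings = {
    "Introduction to D&I": [90, 100],  # two expert reviewers per toolkit
    "Strategies": [60, 95],
}
print(summarize_ratings(ratings))
```

With only two reviewers per toolkit, a mean can mask disagreement (as with the conflicting Strategies feedback described later), so reporting both raw ratings alongside the mean may be preferable.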

Table 3 shows the results of the quantitative questions. With regard to inclusion of key information, the evaluations varied considerably. The introduction to D&I toolkit was considered complete, while the other toolkits needed to be rearranged, updated, or expanded. Overall, however, experts thought the toolkits would be very useful for someone in the D&I field and that the toolkits had reasonable goals, and they said they would recommend the toolkits to others.

Table 3 Results of the evaluation of the Dissemination and Implementation Research Core toolkits

Scale of 10–100, with 10 being “completely disagree” and 100 being “completely agree.” D&I, dissemination and implementation.

Experts were asked, “Overall, do you think that this toolkit is a good resource for (check all that apply): D&I beginners, D&I intermediate, D&I experts?” Most of the toolkits were considered a good fit for all 3 levels, with the exception of the Strategies toolkit (intermediate) and the Designs toolkit (expert and intermediate). Evaluators suggested adding more examples to make the toolkits more useful for beginners in the field. For example:

“For intermediates and experts really, I think it might be helpful to include more examples from the literature of each type of design. For myself, I really like to see examples of these designs in action, and with the number of protocol papers being published, it seems that there are a lot to choose from. Perhaps a bibliography link for each type with multiple examples?”

“This is a useful resource. It would be strengthened by more information at the beginning about when in the proposal process to think about these strategies, and more on how to select strategies that align with certain interventions or other needs. If you wanted to really go to town, you could give more examples in the appendix of how specific strategies have been used, when there are more available. It would also be helpful to see a reference for what research has been done on the effectiveness of each strategy, or note when there is no such evidence.”

One evaluator stated, “I like the idea and would use this in my teaching and mentoring.”

Toolkit Editing

The team gathered the feedback from the expert evaluators and edited the toolkits accordingly. Edits included adding examples, rearranging the flow of information, editing the decision trees, updating content, and adding resources. Some feedback was conflicting; for example, the Strategies toolkit was considered too advanced for a beginner D&I investigator by one expert and perfect for a beginner by the other. In this case, the team decided not to make major changes pending further use and feedback.

Toolkit Reach

The toolkits are currently available online at https://sites.wustl.edu/wudandi/, a page that hosts information about D&I work at Washington University in St. Louis in general. In 2017, the page had more than 1000 views, with about 290 new users and more than 300 returning users (data from Google Analytics). The majority of visitors were from the United States, but some were from Australia (n=10), the United Kingdom and India (n=4 each), and Canada, Guatemala, and Ireland (n=3 each), suggesting a potential international reach for our materials. Consistent with our local ICTS investigators being the primary target audience, about 40% of the visitors were from Missouri.

Discussion

Although the rapid growth of the D&I science field has triggered high demand for capacity building [18], demand for research training far outstrips the supply of slots in current programs [19]. This project provides insight into how toolkits with key D&I concepts can be developed to support investigators moving into the field. These toolkits respond to 3 identified needs for capacity building in D&I research: resources for early-stage investigators, consolidation of tools, and limitations in local contexts [20]. While the D&I literature has expanded rapidly, many resources are geared to those with intermediate or advanced knowledge. Moreover, most are scattered across the literature or internet. While more introductory resources have been developed, there are calls to consolidate resources in order to lower barriers to entry for those new to the field [20].

Chambers et al. [20] note that a challenge in meeting the needs of D&I research trainees is limited support and local capacity. Resources that are available virtually and in real time may help address this challenge. Even in an environment such as ours that is relatively rich in D&I resources, including an ICTS core dedicated to providing technical assistance [21], our DIRC team is often stretched to meet demand for service. For example, the number of customers served has grown from 11 in 2009 to 65 in 2017. The toolkits have expanded our team’s capacity by providing resources investigators can use before, in conjunction with, or following one-on-one consultation and have proven useful to our Center for D&I Research’s annual D&I grant bootcamp [21]. While they have helped us support our customers, the rapid pace of the D&I field requires that the toolkits be updated periodically. Our current procedure includes review and update of all toolkits every 6 months.

This study describes the first step in the development of the toolkits and is limited by its preliminary nature and by the involvement of a relatively small number of external stakeholders (2 expert reviewers per toolkit). The toolkits were developed in response to needs directly experienced in one CTSA. However, the project relied on national experts in D&I for toolkit development and followed a systematic approach to iterative product development and refinement.

Future Directions

The toolkits are currently being updated by our team and will be evaluated in Phase III [13]. Our goal is to involve investigators affiliated with CTSAs nationwide to engage a broader, more diverse audience of users and to test the usability of the toolkits and the D&I knowledge gained among CTSA customers. We will also engage D&I experts from other CTSAs to develop new toolkits on advanced and specialized topics. Possible topics for new toolkits are adaptation in D&I science, deimplementation, and D&I planning early in intervention development. We will also work with other CTSAs to make some toolkits interactive, similar to those developed by other D&I centers [22, 23]. These efforts will be evaluated and reported so others can replicate and add to the process.

Acknowledgments

The authors thank all the research assistants and colleagues who developed the toolkits: Beth Prusaczyk, Donny Gerke, Ericka Lewis, Emily Kryzer, Sara Malone, Karen Lawrence, Alex Ramsey, and Irene Taranhike. The authors also thank the senior scholars who evaluated our toolkits and the Washington University Network of Dissemination and Implementation Researchers (WUNDIR) group for their invaluable feedback and support. The authors also thank the DIRC customers for their feedback.

Financial Support

This work was supported by CTSA Grant UL1 TR002345. A.A.B. is funded by 3U01HL133994-02S1 and R01HG009351; A.B.M. is funded by 1T32HL130357 from NHLBI; R.G.T. and E.K.P. are funded by P30DK092950 from the NIDDK; E.K.P. is also funded by 1R25CA171994; and A.A.B. and E.K.P. are also funded by 2R25MH0809. The contents of this manuscript are solely the responsibility of the authors and do not necessarily represent the official views of the NIH.

Author Contributions

A.A.B. drafted the manuscript; A.A.B. and E.K.P. supervised the development of the toolkits; A.B.M. and R.G.T. developed toolkits and supported the writing of this manuscript. All authors reviewed and approved this version of the manuscript.

Disclosures

The authors have no conflicts of interest to declare.

Supplementary materials

To view supplementary material for this article, please visit https://doi.org/10.1017/cts.2018.316

References

1. Colditz GA, Emmons KM. The promise and challenges of dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, eds. Dissemination and Implementation Research in Health, 2nd edition. New York: Oxford University Press, 2017: 1–18.
2. Guerrero EG, et al. Interdisciplinary dissemination and implementation research to advance translational science: challenges and opportunities. Journal of Clinical and Translational Science 2017; 1: 67–72.
3. Fort DG, et al. Mapping the evolving definitions of translational research. Journal of Clinical and Translational Science 2017; 1: 60–66.
4. Institute of Medicine. The CTSA Program at NIH: Opportunities for Advancing Clinical and Translational Research. Washington, DC: The National Academies Press, 2013.
5. National Center for Advancing Translational Sciences. About the CTSA program [Internet], 2017 [cited Aug 10, 2017]. (https://ncats.nih.gov/ctsa/about)
6. Society for Implementation Research Collaboration [Internet], 2017 [cited Aug 10, 2017]. (https://societyforimplementationresearchcollaboration.org/)
7. Grid-Enabled Measures Database. Dissemination and implementation initiative [Internet], 2012 [cited Aug 8, 2017]. (https://www.gem-beta.org/public/wsoverview.aspx?cat=8&wid=11&aid=0)
8. Dissemination and Implementation Models in Health Research and Practice [Internet] [cited Aug 10, 2017]. (http://dissemination-implementation.org/)
9. National Cancer Institute. Examples of funded grants in implementation science [Internet], 2017 [cited Jan 30, 2018]. (https://cancercontrol.cancer.gov/IS/sample-grant-applications.html)
10. Brownson RC, Colditz GA, Proctor EK, eds. Dissemination and Implementation Research in Health: Translating Science to Practice, 2nd edition. New York: Oxford University Press, 2017.
11. Darnell D, et al. A content analysis of dissemination and implementation science resource initiatives: what types of resources do they offer to advance the field? Implementation Science 2017; 12: 137.
12. Morrato EH, et al. Dissemination and implementation of comparative effectiveness evidence: key informant interviews with Clinical and Translational Science Award institutions. Journal of Comparative Effectiveness Research 2013; 2: 185–194.
13. Agency for Healthcare Research and Quality. AHRQ publishing and communications guidelines. Section 6: toolkit guidance [Internet], 2016 [cited Aug 10, 2017]. (http://www.ahrq.gov/research/publications/pubcomguide/pcguide6.html)
14. van den Akker J, et al. An introduction to educational design research. Proceedings of the seminar conducted at the East China Normal University, Shanghai, PR China, November 23–26, 2007 [Internet], 2010 [cited June 2, 2018]. (http://www.slo.nl/downloads/2009/Introduction_20to_20education_20design_20research.pdf/download)
15. Richey RC, Klein JD. Design and development research. In: Spector JM, Merrill MD, Elen J, Bishop MJ, eds. Handbook of Research on Educational Communications and Technology. New York, NY: Springer, 2014: 141–150.
16. Richey RC, Klein JD. Design and Development Research: Methods, Strategies, and Issues. New York, NY: Routledge/Taylor & Francis Group, 2007.
17. McKenney SE. Computer-Based Support for Science Education Materials Developers in Africa: Exploring Potentials [Internet], 2001 [cited May 2, 2017]. (https://research.utwente.nl/en/publications/computer-based-support-for-science-education-materials-developers)
18. Tabak RG, et al. Dissemination and implementation science training needs: insights from practitioners and researchers. American Journal of Preventive Medicine 2017; 52: S322–S329.
19. Proctor EK, Chambers DA. Training in dissemination and implementation research: a field-wide perspective. Translational Behavioral Medicine 2016; 3: 624–635.
20. Chambers DA, et al. Mapping training needs for dissemination and implementation research: lessons from a synthesis of existing D&I research training programs. Translational Behavioral Medicine 2017; 7: 593–601.
21. Brownson RC, et al. Building capacity for dissemination and implementation research: one university’s experience. Implementation Science 2017; 12: 104.
22. University of Colorado, Dissemination and Implementation Program. Dissemination & implementation: tips for getting funded [Internet], 2018 [cited Jan 2018]. (http://www.crispebooks.org/DIFundingTips)
23. The Center for Research in Implementation Science and Prevention. Dissemination and implementation models in health research & practice [Internet] [cited Jan 2018]. (http://www.dissemination-implementation.org/select.aspx)