
Development and evaluation of a novel training program to build study staff skills in equitable and inclusive engagement, recruitment, and retention of clinical research participants

Published online by Cambridge University Press:  30 August 2022

Jessica R. Cranfill*
Affiliation:
Duke Office of Clinical Research, Duke University, Durham, NC, USA; School of Medicine, Duke University, Durham, NC, USA; Clinical and Translational Science Institute, Duke University, Durham, NC, USA
Stephanie A. Freel
Affiliation:
Duke Office of Clinical Research, Duke University, Durham, NC, USA; School of Medicine, Duke University, Durham, NC, USA; Clinical and Translational Science Institute, Duke University, Durham, NC, USA
Christine E. Deeter
Affiliation:
Duke Office of Clinical Research, Duke University, Durham, NC, USA; School of Medicine, Duke University, Durham, NC, USA; Clinical and Translational Science Institute, Duke University, Durham, NC, USA
Denise C. Snyder
Affiliation:
Duke Office of Clinical Research, Duke University, Durham, NC, USA; School of Medicine, Duke University, Durham, NC, USA; Clinical and Translational Science Institute, Duke University, Durham, NC, USA
Susanna Naggie
Affiliation:
Duke Office of Clinical Research, Duke University, Durham, NC, USA; School of Medicine, Duke University, Durham, NC, USA; Clinical and Translational Science Institute, Duke University, Durham, NC, USA
Nadine J. Barrett
Affiliation:
School of Medicine, Duke University, Durham, NC, USA; Clinical and Translational Science Institute, Duke University, Durham, NC, USA; Duke Cancer Institute, Duke University, Durham, NC, USA
Jamie N. Roberts
Affiliation:
Clinical and Translational Science Institute, Duke University, Durham, NC, USA; Duke Cancer Institute, Duke University, Durham, NC, USA
*
Address for correspondence: J. R. Cranfill, MA, Duke University, 2200 West Main Street, Durham, NC 27705, USA. Email: [email protected]

Abstract

Background:

Adequate, equitable recruitment of underrepresented groups in clinical research and trials is a national problem and remains a daunting challenge to translating research discoveries into effective healthcare practices. Engagement, recruitment, and retention (ER&R) training programs for Clinical Research Professionals (CRPs) often focus on policies and regulations. Although some training on the importance of diversity and inclusion in clinical research participation has recently been developed, there remains a need for training that couples critical equity, diversity, and inclusion (EDI) concepts with skill development in effective recruitment and retention strategies, regulations, and best practices.

Approach and methods:

We developed the ER&R Certificate program as a holistic approach to provide Duke University CRPs the opportunity to build competency in gap areas and to increase comfort in championing equitable partnerships with clinical research participants. The thirteen core and elective courses include blended learning elements, such as e-learning and wiki journaling prompts, to facilitate meaningful discussions. Pre- and post-assessments administered to CRP program participants and their managers assessed program impact on CRP skills in ER&R tasks and comfort in equitable, diverse, and inclusive engagement of clinical research participants.

Results and discussion:

Results from the first two cohorts indicate that CRPs perceived growth in their own comfort with program learning objectives, especially those centered on participant partnership and EDI principles, and most managers witnessed growth in competence and responsibility for ER&R-related tasks. Results suggest value in offering CRPs robust training programs that integrate EDI and ER&R training.

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2022. Published by Cambridge University Press on behalf of The Association for Clinical and Translational Science

Background

The success of clinical research in improving public health depends on robust engagement, recruitment, and retention (ER&R) of participants who meet sample size requirements and represent the diversity of the population. Meeting enrollment goals in general can be challenging. As many as 19% of registered clinical trials are stopped early due to failed accrual [1], and as many as 86% do not achieve their accrual goals within their target timelines [2–4]. Within the context of health equity, this issue becomes even more salient, as lack of participation among underrepresented racial and ethnic groups compromises study outcomes and generalizability and can widen the gap in health disparities [5–8]. For example, underrepresented racial and ethnic groups make up 36% of the US population but account for less than 12% of clinical research participants [9]. This critical negative impact is felt even more keenly when those carrying the greatest burden of a disease are not proportionately represented among research participants.

As Rodriguez-Torres et al. have described, enrollment challenges fall into four general factor categories: study-related, participant-related, study team-related, and system-related [10]. Improving recruitment in clinical research will therefore require multiple complex and multifactorial approaches. Careful consideration of systemic and individual factors, including stereotypes, systemic racism, and bias, and their influence on equitable recruitment of clinical research participants can significantly improve the overall health impact of interventions aiming to improve enrollment and retention. One key opportunity for intervention is training for the staff who engage research participants. Such training should build skills in specific areas such as Equity, Diversity, and Inclusion (EDI) and implicit bias, cultural humility, community engagement and outreach, and tailored communication that is inclusive, raises awareness, enhances trust, and incorporates perspectives of all potential participants [11].

Trainings on recruitment and retention competencies for Clinical Research Professionals (CRPs) are available [12–14]. However, many available offerings focus on rules, regulations, and policies regarding recruitment and informed consent. Recently, specific training programs have been created to address equity and diversity in clinical research participation [15–18]. These have demonstrated success in highlighting the necessity of enrolling underrepresented populations, although evidence of effectiveness to date is somewhat limited [19,20]. Importantly, training programs for CRPs have not integrated regulatory and practical knowledge development with engagement and EDI principles, much less their impact on recruitment practices. Here we describe the development and implementation of a training program for CRPs that combines these critically interrelated concepts, which holistically we refer to as ER&R.

One aim of the Network Capacity Hub of Duke’s current Clinical and Translational Science Award (CTSA) is to provide recruitment training with an equity lens to investigators and staff. As a part of this aim, our curriculum is designed to build deeper skills in CRPs from across the Duke University Schools of Medicine and Nursing, including areas identified as critical educational gaps: personal internal biases and mitigation methods; knowledge of social marketing principles and their applicability to clinical research participation; attention to readability and health literacy needs; the value of adopting participant perspectives and building cultural humility; the importance of trust, trustworthiness, and partnerships; sufficient budgeting for outreach; and community and stakeholder engagement [4,11].

The Duke ER&R Certificate Program is a training and skills-building program designed for staff-level CRPs, such as Clinical Research Coordinators (CRCs). The program’s purpose is to develop and expand ER&R competencies and to provide the tools and confidence necessary for staff to take proactive steps toward more inclusive ER&R practices in research conducted at Duke. This paper describes our efforts to develop, implement, and evaluate a blended instruction certificate program at Duke. We assess two cohorts of CRPs who have completed the certificate program.

Methods

We began by establishing an interdisciplinary Steering Committee whose members planned, developed, implemented, and evaluated the ER&R program. Work was supported by both the Duke Clinical and Translational Science Institute (CTSI) and the Duke Office of Clinical Research (DOCR) and was funded by the Duke CTSA grant and the Duke University School of Medicine. The steering committee consisted of an expert (JR) in ER&R from the CTSI Recruitment Innovation Center (RIC), an expert (NJB) in health equity and inclusive research from the CTSI Equity in Research Core, an expert (JRC) in adult learning and instructional design from DOCR, and an expert (SAF) in workforce development and training from DOCR. We applied the Analysis, Design, Development, Implementation, and Evaluation (ADDIE) [21] instructional design model, which has been shown to produce effective training programs [21,22].

Program implementation took place over a period of 12 months, February 2020 to February 2021, as illustrated in Fig. 1. The goals for course and program design were to exemplify best practices in adult learning, incorporate EDI themes and application in each course, and prepare for program sustainability and ultimate sharing with other research institutions.

Fig. 1. Program implementation and launch timeline for the first cohort (C1) and second cohort (C2) of the Engagement, Recruitment, and Retention program.

Analysis Phase

We developed initial program goals and course objectives based on CRP training needs by engaging the Duke research community through a variety of leadership and CRP networks. The RIC team collated a list of potential training topics based on experience, a needs assessment survey, literature reviews, and consultations with the Duke research community. The steering committee identified and collaborated with subject matter experts (SMEs) from across the enterprise, gathering feedback on potential course content and program goals. Based on known challenges and the literature cited above, the steering committee and SMEs agreed that a foundational focus of the program courses must be on EDI representation in research. From this Analysis phase, the following four key components were defined.

Key Program Components

Expert instruction

Experts across Duke University helped develop the course objectives and content for their area of expertise, facilitate their course, and serve as resources for each topic. An expert in EDI (NJB) ensured facilitators were equipped with the tools, strategies, and framework needed to confidently shape and deliver content for each individual course with an equity lens.

Blended instruction design

The program includes e-learning, synchronous course sessions, and flipped classroom elements. Participants engage in meaningful discussion with facilitators and their peers after completing pre-learning materials.

Continuous engagement

Engaging course materials support ongoing learning, critical thinking, and application of concepts and skills throughout the program. The blended approach includes post-class journaling and continuous resource sharing among participants in a dedicated cohort Wiki space for each session.

Career development

Participants who complete the program receive a certificate that can be included in their portfolio for Duke CRP Tier Advancement opportunities [23].

Design & Development Phases

The ER&R program consists of a series of 1- to 2-hour sessions, beginning with the Just Ask: Equity and Diversity in Clinical Research (NJB) course. Just Ask establishes an initial EDI mindset and provides participants with a foundation to begin identifying their own internal biases and ways to foster inclusive participant partnerships. The Just Ask instructor, along with other EDI experts from across Duke, contributed to the design of each individual course by guiding the development of learning objectives and framing course discussions and activities with an EDI lens. A full list of the courses and objectives for the first and second cohorts is outlined in Table 1. Objectives that include course content related to EDI are indicated with a single asterisk. Additionally, course descriptions are publicly available on the ER&R program webpage.

Table 1. Core (C) and elective (E) courses and learning objectives (O#). The course title column lists program courses, blended learning elements, and any changes from cohort 1 to cohort 2. Double asterisks (**) in the course title column indicate changes or elements that were added for cohort 2 based on feedback and expansion of blended program design. The learning objectives column lists the objectives for each course. Objectives in bold were included in both cohort 1 and cohort 2. Objectives beginning with a single asterisk (*) are related to the program equity, diversity, and inclusion (EDI) lens given the content covered.

Throughout implementation, the steering committee gathered feedback from participants and facilitators, making concomitant changes to the program structure and content. Notable changes to program structure occurred between the first cohort (C1) and second cohort (C2), including 1) requiring three electives rather than two; 2) splitting E1: Community, Stakeholder, and Patient Engagement into two separate courses; 3) adding five new online pre-learning modules; and 4) making C7: Building Trust and Partnerships a required core course rather than an elective. A full list of current program objectives is available in Supplemental Materials (A).

Facilitator EDI Retreat

We recruited an interdisciplinary group of 21 facilitators from across Duke University. Prior to program launch, we hosted an all-facilitator retreat to ensure alignment with the EDI framework. The objective of the retreat was to discuss 1) how equity, diversity, and inclusion are integrated into each course’s topics and objectives; 2) the goal of the overall program; 3) how to address questions and facilitate discussions around EDI; and 4) appropriate language to use when discussing sensitive topics.

To further ensure the incorporation of intentional content tying in EDI principles and practices to each individual course concept, the steering committee was involved in the design of each course’s learning plan, subsequent materials, and activities. Learning plans for each course outlined: 1) content experts/facilitators, 2) method of instruction, 3) pre-learning requirements, 4) course description and objectives, and 5) post-class journaling prompts. Facilitator Guidebooks were created for each course to describe learning objectives, activities, and discussion prompts. This content organization strategy was used to ensure program sustainability and enable eventual sharing across institutions. Two example Course Learning Plans (B) and one Course Facilitator Guide (C) are available in the Supplemental Materials.

Blended Learning Design

As described above, each course includes blended learning elements. The various pre-learning materials for the program include videos, journal articles, website reviews, and e-learning modules. The intention of most pre-learning experiences is to provide learners with an initial introduction to the course content, enabling them to attend the corresponding session with a foundational understanding of the topic. Other pre-learning materials prepare the participants for a specific activity or discussion that would occur during a session.

Both cohorts received an online introductory pre-learning module to supplement Just Ask: Equity and Diversity in Clinical Research. As of C2, six additional program courses include e-learning modules created (JRC) using the Articulate 360 Storyline and Rise development tools. To enhance information processing and recall, each module includes some combination of reading, visual and verbal elements (video, narration, and animation), interactive engagement, and assessment or practice [22,24].

A program Wiki provides a learning hub for materials and resource sharing [25]. Participants can access resources and share how they have used the various strategies they learned. The Wiki remains available after program completion for CRPs to review course materials and to share with colleagues. For each cohort, the Wiki houses: 1) course title and description for each session, 2) resources (reading materials, videos, slides, web links, etc.), 3) pre-learning requirements including e-learning links, 4) post-class journaling prompts, 5) a course evaluation link, and 6) a discussion space for comments and idea sharing. The post-class journaling prompts are intentionally designed to encourage program participants to think about the content covered with an inclusivity lens. With these prompts, we ask program participants to reflect on various ways the content ties back to EDI recruitment and engagement of clinical research participants. Journaling prompt examples are included in the two Learning Plan samples in Supplemental Materials B. Course facilitators and members of the Steering Committee monitor ongoing conversations in the Wiki, providing additional information for consideration and thoughtful discourse.

Implementation Phase

Initial implementation of the ER&R program occurred over a period of 12 months. Important steps to ensure successful implementation included gathering buy-in from Clinical Research Unit (CRU) managers to ensure their support of staff attending the program and marketing the program to the CRP community. This occurred via presentations and announcements provided first to CRU leadership and then to the full CRP community.

As shown in Fig. 1, the initial nomination period opened on July 6, 2020. CRU leadership and managers were encouraged to nominate CRPs who perform recruitment and retention functions. Nominations closed one week earlier than anticipated due to overwhelming response. The steering committee reviewed all nominee applications and expected nominees to have sufficient experience to return to their units as ER&R mentors upon program completion. The program used a web-based data collection tool, REDCap, to house and track all forms and associated alerts for nomination, acceptance, elective selection, and evaluation [26].

Participants are required to complete one online prerequisite and six core courses, and to attend at least three of seven elective courses, to receive a certificate. Although classes were originally planned in person, the COVID-19 pandemic required a pivot to virtual sessions via Zoom. The first three cohorts were held twice yearly due to high demand for participation. Starting in 2022, Duke is offering the program once per year. Enrollment and completion tracking occur via the Duke Learning Management System (for e-learning and session attendance) and REDCap (for other elements) [26].

Evaluation Phase

Evaluations address the first three levels of the Kirkpatrick model of evaluation [27]: reaction, learning, and behavior. First, participants complete course evaluations following each session to rate the success of each course at achieving its learning objectives. Second, before and after program completion, participants complete identical self-assessments addressing their comfort with performing the session objectives for all core courses and their chosen electives on a six-point scale from very uncomfortable (1) to very comfortable (6). The self-assessments meet Kirkpatrick evaluation Level 2 and focus on self-perceived learning achieved during the program [27]. Third, managers complete assessments (Fig. 2 and Supplemental Materials D and E) before program start and six months after program completion to assess their employee’s level of competency in recruitment and retention-related tasks. These manager-scored assessments are existing validated tools used at Duke to measure competency achievement based on the JTFCR-defined recruitment and retention competencies toward Tier Advancement [23]. This assessment meets Kirkpatrick evaluation Level 3, behavior [27].

Fig. 2. Recruitment and Retention tasks addressed in manager pre- and post-assessments of task responsibility.

We developed an interview guide (Supplemental Materials F) to collect additional information about participant experiences and takeaways from the program. Six students (3 from C1 and 3 from C2) participated in these interviews with an independent interviewer. This was a first step toward understanding participants’ perceptions of the program’s ability to promote integration of EDI principles into ER&R behavior.

Results

We received 73 nominations within just a few days of opening the nomination process. With a planned cap of approximately 30 participants, we asked nominators from each CRU to prioritize two individuals for C1 and defer others to C2. All deferred individuals were guaranteed the opportunity to participate in C2. C1 graduated 32 CRPs, C2 graduated 25, and the third cohort graduated 23. Evaluation of the third and most recent cohort is still in process.

Program Participants

A total of 59 CRPs, representing 18 of 23 Duke CRUs, participated in C1 and C2, including three visiting participants from the University of North Carolina at Chapel Hill (UNC-CH) and one from Durham Technical Community College (DTCC), with a retention rate of 93% over the 6-month program. Table 2 displays information regarding program participant job titles, demographics, roles, and tenure in their position. Notably, almost 30% of program participants have worked in clinical research for seven or more years, indicating a perceived need for this type of training even among more seasoned staff. Program participants discussed the need for ongoing efforts to diversify the CRP workforce nationally, while ensuring access to training that incorporates an EDI lens to their work. It is encouraging that a relatively diverse group of CRPs were interested in the program and able to engage in rich discussions from a variety of perspectives.

Table 2. Participant information captured at nomination from cohort 1 and cohort 2. The table displays job title, length of time in clinical research, percentage of effort (time) spent on recruitment and retention tasks, and demographic information captured from 55 of 59 students, including race, ethnicity, sex, and age.

Self-Assessments

Self-assessments measured participant comfort with the course objectives listed in Table 1. Due to some changes in course offerings and objectives between C1 and C2, we analyzed objectives separately for C1 and C2 and then analyzed overlapping objectives between cohorts. For both C1 and C2, average comfort level grew from pre to post for every course objective. The average increase in comfort for each objective ranged from 7.6% to 42.4% (C1) and 8.5% to 52.6% (C2). The full data table is available in the Supplemental Materials (G).
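The per-objective growth metric above can be reproduced directly from paired pre/post ratings. The sketch below illustrates the computation with hypothetical ratings on the program's six-point comfort scale; the values and variable names are illustrative, not study data.

```python
# Hypothetical pre/post comfort ratings (1 = very uncomfortable,
# 6 = very comfortable) for one learning objective across a small cohort.
pre = [3, 4, 3, 5, 4, 3]
post = [5, 5, 4, 6, 5, 5]

def mean(ratings):
    """Average rating across the cohort."""
    return sum(ratings) / len(ratings)

pre_avg, post_avg = mean(pre), mean(post)

# Percent increase in average comfort from pre- to post-program,
# the growth metric reported per objective.
pct_increase = (post_avg - pre_avg) / pre_avg * 100

print(f"pre={pre_avg:.2f}, post={post_avg:.2f}, increase={pct_increase:.1f}%")
# → pre=3.67, post=5.00, increase=36.4%
```

With paired assessments, the same loop can be repeated per objective to rank objectives by growth, as in Table 3.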

The courses for C1 and C2 included 63 and 65 total learning objectives, respectively. Table 3 displays the ten objectives for each cohort with the highest increase in pre- to post-program comfort and indicates which objectives are directly related to EDI given the content covered. Five of the ten objectives with the greatest increase for C1 were in E1: Community and Stakeholder Engagement. Other notable growth for C1 (3 of the top 10) occurred in the objectives for E5: Shoestring Budget. For C2, six of the top ten objectives were in the E4: Community-Engaged Research Initiatives (2) and E8: Stakeholder Engagement (4) courses. Another notable increase in comfort for C2 occurred in the C5: Social Marketing course (3 of 10).

Table 3. The 10 learning objectives with the highest percent increase in comfort according to cohort 1 and cohort 2 self-assessments. Bolded rows reflect community and stakeholder engagement-related objectives. Objectives beginning with a single asterisk (*) are related to the program’s equity, diversity, and inclusion (EDI) lens given the content covered

As indicated in Table 1, over half of all individual course objectives (35/63 for C1 and 39/65 for C2) are related to EDI given the content covered in class. For C2, all but three EDI-related objectives had an average rating of 5 (comfortable) or higher after program completion. Notably, the C2 objectives with an average rating of less than 5 post-program were still rated high, between 4.6 and 4.9, and fall into the top 10 objectives with the most average growth from pre to post in Table 3. Similarly, for C1, only 4/35 EDI-related objectives had an average comfort rating of less than 5 (4.6–4.9) post-program. As shown in Table 3, two of these C1 objectives were the objectives with the most growth from pre to post for C1. Those corresponding C1 courses (E3: Telling the Story, E6: Remote Consent, E1: Community and Stakeholder Engagement) were updated significantly for C2.

Fig. 3 shows mean comfort level pre- and post-program from 1 (very uncomfortable) to 6 (very comfortable) for course objectives that overlapped (stayed the same) between C1 and C2. This figure does not include data from the community-engaged research or stakeholder engagement-related courses, which had significantly different course offerings and objectives between cohorts.

Fig. 3. Average comfort level (very uncomfortable (1) to very comfortable (6)) with overlapping course objectives before and after program completion. Table 1 outlines the specific course objectives that map to the codes in this figure (e.g. C1-01 = Core 1, Objective 1 and E2-01 = Elective 2, Objective 1). A single asterisk (*) indicates courses where an additional e-learning module was added as pre-learning prior to holding the course for Cohort 2 (C2).

For 39 (79.6%) of the 49 overlapping learning objectives, C2 showed more growth than C1 in average comfort level from pre to post. The higher comfort post-program and/or greater growth for C2 might be attributed to the addition of more blended elements for the second cohort, including creation of online pre-learning modules and additional discussion time in class. Every course that added a blended learning element for C2 showed greater overall comfort growth over C1. This difference between cohorts could also be attributed to the selection process for C1 and C2. C1 participants were hand selected from a list of 74 nominees by leadership in each unit as most likely to return to the unit as a resource upon program completion, possibly indicating a higher overall baseline comfort level.

For both cohorts, the objectives with the lowest pre-program comfort ratings (3 to 3.6 for C1 and 2.8 to 3.3 for C2) were in the E5: ER&R on a Shoestring Budget, E2: Social Media, and C5: Social Marketing courses. The objectives with the highest pre-program comfort (4.6 to 5.2 for C1 and 4.5 to 5.1 for C2) were in the C7: Building Trust and C6: Active Listening courses. There were notable differences between C1 and C2 percent increases in average comfort level for E3: Telling the Story and E6: eConsent Design, with C2 being less comfortable with the associated objectives pre-program and more comfortable post-program. After program completion, the average comfort range for all overlapping program learning objectives was between 4.4 and 5.6 for C1 and 4.4 and 5.7 for C2, with a maximum of very comfortable at 6. Retrospective pre-self-assessments, conducted after program completion, largely recapitulated pre-program scoring, supporting the validity of participants’ perceived self-growth.

Manager Assessments

Assessments were completed by the managers of 26 of the 32 participants from the first cohort six months after program completion; the six missing assessments were due to staff turnover or non-response. Data from C2 are not included, as that cohort had not yet reached 6 months post-program at the time of publication. Manager-scored assessments for C1 captured overall recruitment and retention competency levels for each participant before and six months after the program. As shown in Fig. 4, Panel A, growth in the retention competency levels of C1 participants is apparent, with an increase from 19% scored as Advanced pre-program to 50% post-program. Similar growth is shown for overall recruitment competency, with 31% scored as Advanced pre-program and 65% post-program.

Fig. 4. Panel A: The count of Clinical Research Professionals from Cohort 1 (N = 26 Total Participants) who achieved each competency level according to manager-scored assessments completed before and after program completion.

Fig. 4. (Continued) Panel B: The count of Clinical Research Professionals from Cohort 1 (N = 26 Total Participants) that correspond to each manager-reported level of responsibility for recruitment and retention-related tasks before and after program completion: NA (not part of current job), GUI (does with guidance or assists), IND (does independently), LEAD (leads, trains, or mentors others in task).

The individual components comprising this manager-scored assessment show similar competency growth (Fig. 4, Panel B). The highest level of responsibility growth was in “identifying systemic retention issues,” with 17 participants performing the task independently (13) or leading efforts (4) pre-program and 22 participants performing these tasks independently (11) or leading efforts (11) six months after program completion. Other notable growth occurred in “recommending changes to the recruitment process” (with 4 of 26 students leading the task pre-program and 12 leading post-program) and “identifying and reporting recruitment issues” (with 9 of 26 students leading pre-program and 15 leading post-program). Competence in each of the three tasks mentioned above is arguably critical for expanding staff confidence to identify current and future studies that could have more robust, EDI-focused recruitment and retention plans and to recommend the more inclusive practices they learned throughout the program. N/A responses within the manager assessment indicate that an individual does not perform the task as part of their current job. We suspect that the growth of N/A responses from pre to post for some tasks indicates that individuals changed roles, or that their managers better understood, after program completion, what a task involves and whether the employee performs it.
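The pre/post responsibility counts reported above amount to tallying, per task, how many participants fall at each level. A minimal sketch with hypothetical responses (using the level codes defined for Fig. 4, Panel B; the data are illustrative, not the study's):

```python
from collections import Counter

# Responsibility level codes from the manager assessment:
# NA = not part of current job, GUI = does with guidance or assists,
# IND = does independently, LEAD = leads, trains, or mentors others.
LEVELS = ["NA", "GUI", "IND", "LEAD"]

# Hypothetical manager-reported levels for one task, pre- and post-program.
pre = ["GUI", "IND", "IND", "GUI", "LEAD", "NA"]
post = ["IND", "IND", "LEAD", "IND", "LEAD", "NA"]

def tally(responses):
    """Count participants at each responsibility level, in fixed order."""
    counts = Counter(responses)
    return {level: counts.get(level, 0) for level in LEVELS}

print("pre: ", tally(pre))
print("post:", tally(post))
```

Comparing the two tallies per task yields the shifts toward IND and LEAD described in the text.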

Post-Class Satisfaction Surveys and Participant Interviews

Each class included a post-class satisfaction survey, with responses used to iterate on course objectives and content. For example, feedback from C1 participants prompted changes to the structure and content of several electives (see Table 1: E1–E6, E8).

Semi-structured interviews with 6 program participants provided additional context on reasons for participation and the program’s impact on daily work. An interview guide (Supplementary Materials F) focused on missing content, use and sharing of gained knowledge, barriers to incorporating learned concepts, and desired additional opportunities. We conducted an inductive thematic content analysis of the de-identified interview transcripts to identify common topics and themes across interviews. Following review of each transcript, the reviewers (JC and JR) catalogued summary topics mentioned by each interviewee and categorized them. As a final step, the categories were compared across interviews to identify themes and to ensure agreement between the two reviewers. The overarching themes that arose most frequently were the following:

  1. EDI focus as a motivating factor to participate and key takeaway (6/6 interviewees – 21 mentions)

  2. Appreciation and/or desire for more opportunities to connect with and learn from the other CRPs enrolled in the program (6/6 interviewees – 16 mentions)

  3. Application of the resources shared in the courses (6/6 interviewees – 15 mentions)

  4. A desire for more “tips and tricks” and practical examples of how to implement strategies to improve ER&R efforts (6/6 interviewees – 12 mentions)

  5. PI or unit buy-in and receiving a set protocol from the study sponsor as barriers to applying strategies learned (5/6 interviewees – 9 mentions)

  6. General appreciation for the program itself (5/6 interviewees – 8 mentions)
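The per-theme summaries above (“n/6 interviewees – m mentions”) can be produced from coded excerpts with a simple tally. The coded tuples and theme names below are hypothetical examples, not the study’s actual codes; this is a sketch of the counting step only, not of the qualitative coding itself.

```python
from collections import defaultdict

# Hypothetical coded mentions: (interview_id, theme), one tuple per mention.
codes = [
    (1, "EDI focus"), (1, "EDI focus"), (2, "EDI focus"), (3, "EDI focus"),
    (2, "peer connection"), (4, "peer connection"),
]

def summarize(codes):
    """Return {theme: (unique_interviewees, total_mentions)}."""
    mentions = defaultdict(int)
    interviewees = defaultdict(set)
    for interview_id, theme in codes:
        mentions[theme] += 1
        interviewees[theme].add(interview_id)
    return {t: (len(interviewees[t]), mentions[t]) for t in mentions}

summary = summarize(codes)
for theme, (n_people, n_mentions) in sorted(summary.items()):
    # 6 is the total number of interviews in this example
    print(f"{theme}: {n_people}/6 interviewees - {n_mentions} mentions")
```

Counting unique interviewees separately from total mentions distinguishes how widespread a theme is from how often it recurs, mirroring the dual counts reported above.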

Overall, results from the interviews and post-class evaluations indicated a positive experience, immediate application of many of the resources shared, and valuable takeaways from the perspective of the CRPs who completed the program. The interviewees provided instrumental feedback for future program improvements, including the need to incorporate more time for sharing practical strategies and to provide learners with additional opportunities to connect with one another. Importantly, a few of our interviewees commented on confidence, confirming our hope that this program would increase their confidence in influencing ER&R efforts for the studies for which they are responsible. For example, interviewee 6 stated:

“This program really gave me the confidence to speak up and say ‘this is what has been successful for me,’ or you know ‘thanks for sharing that problem with the group, here’s resource that might be helpful’ or ‘this is what I would suggest,’ and so I think it just kind of gave me the voice to feel confident doing that. This just really gave me the confidence to speak up and have resources or evidence to back it up… I think other times, where I may have been quiet in meetings, now I’m like, actually, you know ‘here’s my idea and here’s where I got it from,’ and I feel very comfortable and confident doing that and I’m not sure if I would have without the program overall.”

Finally, it is notable that the most-cited barriers to implementing what participants learned involved investigator buy-in and sponsored protocols that are rigid and under-budgeted for proactive EDI, recruitment, and retention efforts.

Discussion and Next Steps

Implementation of this program began with a draft list of topics to support skill development among clinical research staff, envisioned by one person (JR) based on years of experience working in the ER&R space. Turning that list into a viable curriculum required the effort and resourcefulness of a multidisciplinary team who appreciated the need for, and envisioned the value of, such an endeavor. Identifying and engaging volunteer SMEs from across Duke was simple; however, accommodating their personal calendars and schedules, especially as the world rolled into a full-blown pandemic, proved one of our greatest challenges. Designing a local curriculum with the expectation of eventually sharing it with other research institutions was also a challenge, but one that led to the identification and implementation of efficient tools, such as e-learning modules and facilitator guides.

Development of our program required significant personnel resources. These efforts were instrumental in creating a framework for the program that is available to other institutions by request. Over the first 2 years, program development and implementation required approximately one full-time equivalent, split across roughly 3 to 4 primary personnel, plus volunteered effort from the 21 facilitators. Duke has a centrally funded clinical research support office [28] and CTSA-supported staff who were able to serve in program management and education coordinator roles to support building the program. Buy-in from CRPs, managers, and volunteer SMEs/facilitators across the institution who recognized the need and value of training was also an important component.

Session feedback permitted us to evolve the program effectively, resulting in greater self-perceived learning from C1 to C2. Courses whose design was revised showed greater average growth in comfort than those without additional blended learning elements, and every course that added an e-learning module showed a greater overall percentage increase in comfort for C2 than for C1. The most notable C2-over-C1 growth occurred in E3: Telling the Story of Your Research and E6: Remote Informed Consent, which may be attributed to changes made to the course materials in response to participant feedback, including the incorporation of additional meaningful activities and practical examples.
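The "percentage increase in comfort" comparison above can be sketched as a percent change in mean comfort on the 6-point scale. The scores below are hypothetical, chosen only to illustrate a C2 revision outperforming C1; they are not the study's data.

```python
def pct_increase(pre_scores, post_scores):
    """Percent change in mean comfort (1 = very uncomfortable, 6 = very comfortable)."""
    pre_mean = sum(pre_scores) / len(pre_scores)
    post_mean = sum(post_scores) / len(post_scores)
    return 100.0 * (post_mean - pre_mean) / pre_mean

# Hypothetical self-assessment scores for one objective in each cohort
c1 = pct_increase(pre_scores=[2, 3, 3, 4], post_scores=[4, 5, 5, 5])
c2 = pct_increase(pre_scores=[2, 3, 3, 4], post_scores=[5, 5, 6, 5])
# A larger value for c2 than c1 would mirror the growth seen after course revisions
```

Note that percent change of a mean is sensitive to the baseline: objectives with low initial comfort can show large percentage increases from modest absolute gains, which is relevant when comparing objectives with very different pre-program scores.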

Both cohorts showed low initial comfort levels and large percentage increases for the course objectives related to community and stakeholder engagement. Our hope is that significant growth in this area will embolden CRPs to form equitable and inclusive partnerships within our communities and to promote the benefit of such partnerships to their faculty investigators. Importantly, some of the strongest areas of average comfort growth were in EDI-related objectives. The courses with the least average comfort growth, owing to high pre-program scores (C7: Building Trust and Partnerships and C6: Active Listening to Enhance Respect and Awareness of Perspectives), may reflect some degree of social desirability in how people report their comfort with objectives such as C6-O2 (recognizing the importance of active listening) and C7-O1 (defining trust and trustworthiness). Individuals may not wish to think of themselves as “bad listeners” or “untrustworthy.” What we do not assess, and perhaps should, is whether the pressure of the workday impacts CRPs’ active listening skills and thereby reduces their ability to build trusting partnerships.

Both the self-assessments of comfort and the manager-scored assessments of competency showed clear growth across ER&R-related objectives and tasks. The manager assessments completed for C1, in particular, show a clear indication of competency growth and an increase in ER&R task responsibility, including for those competencies that will hopefully bolster CRP confidence in identifying issues and recommending more inclusive practices for their studies. We recognize that managers were aware of their staff’s participation in the ER&R program, which may confound post-program scoring. However, the time between pre- and post-assessments lends confidence that they reflect real behavior change.

Ultimately, the tangible measure of the effectiveness of a program such as the ER&R Certificate will be increased participation and retention, particularly among underrepresented populations. However, such measures require significant time post-intervention to realize, and confounding factors may make direct causal determinations difficult, such as additional enterprise-wide EDI initiatives being incorporated into the fabric of the institution and the COVID-19 pandemic falling within the program’s time span. Much positive change is occurring across academic medical centers, with more intentional community engagement and unprecedented efforts to dismantle systemic racism and advance health equity in health care and research (see Footnote 2); thus, it is difficult to discern which improvements are solely attributable to this specific effort. Our goal is to continue evaluating measures of inclusion and representation across Duke, especially for programs impacted by our ER&R participants, over the next several years. Initiatives are already underway to better evaluate our enrollment and retention rates across all populations; these data will be made openly available so that we may hold ourselves accountable, and be held accountable by the community we serve, for improvements. This will enable us to examine the broad impact of all initiatives directed toward fostering more inclusive participation.

Next Steps

We have invited at least one participant from another institution into each cohort, including individuals from UNC-CH and DTCC. Future iterations of the program will be offered in tandem with the DTCC Clinical Research Equity Scholars Program [29], so that students in the DTCC program may attend our courses as part of their experience. We see an opportunity to expand this collaboration to other institutions with similar programs to prepare participants for careers in clinical research.

The steering committee is licensing the program content repository to share with other institutions under a Creative Commons license, allowing them to tailor it to their needs and unique characteristics. Sharing this repository of work will allow other institutions to implement similar programs to benefit their CRPs with less effort and cost. We are currently piloting this implementation initiative with colleagues at UNC-CH. We invite large academic health centers that implement similar programs to consider potential partnerships between institutions, like ours with DTCC, and to include local community clinics and other small local organizations where feasible. Collaborations like this will make training more accessible for clinical research programs in diverse healthcare settings and for individuals on clinical research career paths. In addition, program content will be included in an e-library using Duke University’s Medical Center Library and Archives LibGuide resource. This will create an open-to-the-public searchable library of ER&R resources and tools provided within the program, as well as additional resources as they are identified.

Conclusion

Barriers to equitable ER&R exist at every level of clinical research participation, including the study design level (overly restrictive eligibility criteria, burdensome participation demands, etc.), the system level (lack of flexible scheduling, resources, and staffing; academic promotion and tenure practices that do not acknowledge the real costs of managing research; stereotypes and biases; etc.), and the community level (lack of knowledge, not being informed, distrust due to past atrocities, and lack of access). Each of these issues can and should be addressed both independently and in complementary ways, including educating investigators to mitigate barriers over which they have some measure of control (e.g., equitable recruitment practices, anti-bias and anti-racism, less burdensome study designs, adequate ER&R funding or resources). According to program participants, some barriers to ER&R are built into the study protocols they are asked to implement. While this program iteration focused on staff development, as a next step we see value in using the ER&R program resources to design workshops that help investigators identify and mitigate these barriers before study protocols and budgets are finalized. The Just Ask content itself is a component of the recently released joint recommendations by the American Society of Clinical Oncology and the Association of Community Cancer Centers for increasing racial and ethnic EDI in cancer clinical trials [30,31]. The Just Ask Training Program has already been widely adopted, used by 75 cancer clinical research sites nationally, including academic centers and community clinics, and found effective in building awareness and skills toward addressing inequities in clinical research participation [30,31]. Additionally, sponsors and agencies such as the NIH and FDA are taking crucial steps toward addressing barriers through policy and guidance for clinical studies conducted in partnership with, or funded by, them [32,33].

With a baseline of trained staff and steady nomination numbers, the ER&R program will now be offered annually at Duke and will also be available to students in CRP training programs at North Carolina Central University, UNC-CH, and DTCC. For the Duke CRPs who have completed the program, we are developing a train-the-trainer workshop that will allow them to use program materials to transfer the knowledge they have gained to colleagues in their units and to serve as program facilitators if desired. Our hope is that this knowledge-sharing, along with the continually accessible wiki and e-library, will lead to more CRPs who appreciate the importance of inclusive and equitable recruitment to clinical research studies. To that end, we think it is important to continue building a community of participant-facing staff who are competent in ER&R practices and well equipped to confidently champion inclusive practices in clinical research and trials. Results indicate that this program has been a step in the right direction toward expanding comfort and competence in these critically interrelated concepts for CRPs at Duke.

Supplementary Material

To view supplementary material for this article, please visit https://doi.org/10.1017/cts.2022.456.

Acknowledgements

The authors would like to acknowledge funding for the ER&R program initiative from Duke’s CTSA grants UL1TR002553 and UL1TR001117. Research reported in this publication was supported by the National Center For Advancing Translational Sciences of the National Institutes of Health under Award Number UL1TR002553. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Additionally, the authors acknowledge the valuable contributions, including course development/facilitation and program management, from the following individuals: Brian Southwell, Ph.D., Camila Reyes, Cecilia Chamorro, Chris Simon, Ph.D., Christopher Barrett Bowling, M.D., David Klein, Ph.D., Deborah Fidelman, Eve Marion, Holly Tiemann, Jennifer Thomas, M.P.H., Judy Hall, Julie Opiteck-Carpenter, Karl Bates, Kelly Keefe, Kenisha Bethea, M.P.H., Leatrice Martin, Leonor Corsino, M.D., Michael Musty, Nicholas Eberlein, Pamela Maxson, Ph.D., Patty McAdams, Renee Leverty, Rowena Dolor, M.D., Sabrena Mervin-Blake, Sameer Cheema, Sierra Lindo, Vincent Miller.

Data were collected and managed using REDCap electronic data capture tools hosted at Duke University School of Medicine [26]. REDCap (Research Electronic Data Capture) is a secure, web-based software platform designed to support data capture for research studies, providing 1) an intuitive interface for validated data capture; 2) audit trails for tracking data manipulation and export procedures; 3) automated export procedures for seamless data downloads to common statistical packages; and 4) procedures for data integration and interoperability with external sources.

Disclosures

The authors have no conflicts of interest to declare.

Footnotes

1 The steering committee reviewed the post-class evaluation surveys for each class and made changes to course content and objectives to ensure that the program continues to evolve to meet participants’ needs and respond to their feedback. Hence, objectives, content, and course offerings have evolved.

2 To learn more about Duke’s efforts across the Health System and Schools of Medicine and Nursing, please visit the Duke Health Moments to Movement website.

References

1. Carlisle B, Kimmelman J, Ramsay T, MacKinnon N. Unsuccessful trial accrual and human subjects protections: an empirical analysis of recently closed trials. Clin Trials 2015; 12(1): 77–83. DOI 10.1177/1740774514558307
2. Treweek S, Lockhart P, Pitkethly M, Cook JA, Kjeldstrom M, Johansen M. Methods to improve recruitment to randomised controlled trials: Cochrane systematic review and meta-analysis. BMJ Open 2013; 3: e002360. DOI 10.1136/bmjopen-2012-002360
3. Anderson DL. A Guide to Patient Recruitment: Today’s Best Practices and Proven Strategies. Boston, MA: CenterWatch/Thomson Healthcare, 2001.
4. Sullivan J. Subject recruitment and retention: barriers to success. Applied Clinical Trials Online [Internet], 2004 [cited February 15, 2022]. (https://www.appliedclinicaltrialsonline.com/view/subject-recruitment-and-retention-barriers-success)
5. Durant RW, Wenzel JA, Scarinci IC, et al. Perspectives on barriers and facilitators to minority recruitment for clinical trials among cancer center leaders, investigators, research staff, and referring clinicians: enhancing minority participation in clinical trials (EMPaCT). Cancer 2014; 120(7): 1097–1105. DOI 10.1002/cncr.28574
6. Murthy VH, Krumholz HM, Gross CP. Participation in cancer clinical trials: race-, sex-, and age-based disparities. JAMA 2004; 291(22): 2720–2726. DOI 10.1001/jama.291.22.2720
7. Niranjan SJ, Martin MY, Fouad MN, et al. Bias and stereotyping among research and clinical professionals: perspectives on minority recruitment for oncology clinical trials. Cancer 2020; 126: 1958–1968. DOI 10.1002/cncr.32755
8. Barrett NJ, Rodriguez EM, Iachan R, et al. Factors associated with biomedical research participation within community-based samples across 3 National Cancer Institute-designated cancer centers. Cancer 2020; 126(5): 1077–1089. DOI 10.1002/cncr.32487
9. The Society for Women’s Health Research and United States Food and Drug Administration Office of Women’s Health. Dialogues on Diversifying Clinical Trials: Successful Strategies for Engaging Women and Minorities in Clinical Trials, 2011. (https://www.fda.gov/downloads/ScienceResearch/SpecialTopics/WomensHealthResearch/UCM334959.pdf)
10. Rodriguez-Torres E, Gonzalez-Perez MM, Diaz-Perez C. Barriers and facilitators to the participation of subjects in clinical trials: an overview of reviews. Contemporary Clinical Trials Communications 2021; 23: 100829. DOI 10.1016/j.conctc.2021.100829
11. Barrett NJ, Ingraham KL, Vann Hawkins T, Moorman PG. Engaging African Americans in research: the recruiter’s perspective. Ethnicity & Disease 2017; 27(4): 453–462. DOI 10.18865/ed.27.4.453
12. ACRP Course Catalogue [Internet]. The Association of Clinical Research Professionals, 2022. (https://acrpnet.org/training/)
13. Explore Our Courses [Internet]. The Collaborative Institutional Training Initiative (CITI Program), 2022. (https://about.citiprogram.org/courses/)
14. Conferences & Education [Internet]. Society of Clinical Research Associates, 2022. (https://www.socra.org/conferences-and-education/)
15. Kusnoor SV, Villalta-Gil V, Michaels M, et al. Design and implementation of a massive open online course on enhancing the recruitment of minorities in clinical trials - Faster Together. BMC Medical Research Methodology 2021; 21(1): 44. DOI 10.1186/s12874-021-01240-x
16. Townsend D, Mills N, Savović J, et al. A systematic review of training programmes for recruiters to randomised controlled trials. Trials 2015; 16: 432. DOI 10.1186/s13063-015-0908-6
17. Niyibizi N, McIntosh S, Hudson BL, Sipocz A, Paku E, Dykes C. CTSA recruitment resources: an inventory of what CTSA hubs are currently offering. Journal of Clinical and Translational Science 2020; 4: 529–536. DOI 10.1017/cts.2020.44
18. Anwuri VV, Hall LE, Mathews K, et al. An institutional strategy to increase minority recruitment to therapeutic trials. Cancer Causes Control 2013; 24(10): 1797–1809. DOI 10.1007/s10552-013-0258-1
19. Delaney H, Devane D, Hunter A, et al. Limited evidence exists on the effectiveness of education and training interventions on trial recruitment; a systematic review. Journal of Clinical Epidemiology 2019; 113: 75–82. DOI 10.1016/j.jclinepi.2019.05.013
20. Quinn GP, Jimenez J, Meade CD, et al. Enhancing oncology health care provider’s sensitivity to cultural communication to reduce cancer disparities: a pilot study. Journal of Cancer Education 2011; 26(2): 322–325. DOI 10.1007/s13187-011-0223-1
21. Branson RK, Rayner GT, Cox JL, Furman JP, King FJ, Hannum WH. Interservice Procedures for Instructional Systems Development: Executive Summary and Model (Vol. 1–5). TRADOC Pam 350-30. Ft. Monroe, VA: U.S. Army Training and Doctrine Command, 1975.
22. Stavredes T, Herder T. A Guide to Online Course Design: Strategies for Student Success. San Francisco, CA: John Wiley & Sons, 2014.
23. Deeter C, Hannah D, Reyes C, et al. Professional development for clinical research professionals: implementation of a competency-based assessment model. Journal of Research Administration 2020; 51(2): 15–40.
24. Mayer R. Cognitive theory of multimedia learning. In: Mayer R, ed. The Cambridge Handbook of Multimedia Learning. Cambridge: Cambridge University Press, 2014, pp. 43–71. DOI 10.1017/CBO9781139547369.005
25. Lombardozzi C. Learning Environments by Design. Alexandria, VA: ATD Press, 2015.
26. Harris P, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap) - a metadata-driven methodology and workflow process for providing translational research informatics support. Journal of Biomedical Informatics 2009; 42(2): 377–381.
27. Mattox JR, Van Buren M, Martin J. Learning Analytics: Measurement Innovations to Support Employee Development. Philadelphia, PA: Kogan Page Publishers, 2016.
28. Snyder DC, Brouwer R, Ennis CL, et al. Retooling institutional support infrastructure for clinical research. Contemporary Clinical Trials 2016; 48: 139–145.
29. Duke Clinical and Translational Science Institute Newsroom. Internship Provides Hands-On Clinical Research Experience with Equity Lens, 2022. (https://ctsi.duke.edu/news/internship-provides-hands-clinical-research-experience-equity-lens)
30. American Society of Clinical Oncology. News Releases: Diversify Your Cancer Clinical Trials with New Recruitment and Retention Resources, 2022. (https://www.asco.org/about-asco/press-center/news-releases/diversify-your-cancer-clinical-trials-new-recruitment-and)
31. Oyer RA, Hurley P, Boehmer L, et al. Increasing racial and ethnic diversity in cancer clinical trials: an American Society of Clinical Oncology and Association of Community Cancer Centers joint research statement. Journal of Clinical Oncology 2022; 40(19): 2163–2171. DOI 10.1200/JCO.22.00754
32. U.S. Food & Drug Administration. Guidance Document: Enhancing the Diversity of Clinical Trial Populations — Eligibility Criteria, Enrollment Practices, and Trial Designs. Guidance for Industry, 2020. (https://www.fda.gov/regulatory-information/search-fda-guidance-documents/enhancing-diversity-clinical-trial-populations-eligibility-criteria-enrollment-practices-and-trial)
33. National Institute on Minority Health and Health Disparities. The NIH Minority Health and Health Disparities Strategic Plan, 2022. (https://nimhd.nih.gov/about/strategic-plan/)
Fig. 1. Program implementation and launch timeline for the first cohort (C1) and second cohort (C2) of the Engagement, Recruitment, and Retention program.

Table 1. Core (C) and elective (E) courses and learning objectives (O#). The course title column lists program courses, blended learning elements, and any changes from cohort 1 to cohort 2. Double asterisks (**) in the course title column indicate changes or elements that were added for cohort 2 based on feedback and expansion of blended program design. The learning objectives column lists the objectives for each course. Objectives in bold were included in both cohort 1 and cohort 2. Objectives beginning with a single asterisk (*) are related to the program equity, diversity, and inclusion (EDI) lens given the content covered.

Fig. 2. Recruitment and Retention tasks addressed in manager pre- and post-assessments of task responsibility.

Table 2. Participant information captured at nomination from cohort 1 and cohort 2. The table displays job title, length of time in clinical research, percentage of effort (time) spent on recruitment and retention tasks, and demographic information captured from 55 of 59 students, including race, ethnicity, sex, and age.

Table 3. The 10 learning objectives with the highest percent increase in comfort according to cohort 1 and cohort 2 self-assessments. Bolded rows reflect community and stakeholder engagement-related objectives. Objectives beginning with a single asterisk (*) are related to the program’s equity, diversity, and inclusion (EDI) lens given the content covered.

Fig. 3. Average comfort level (very uncomfortable (1) to very comfortable (6)) with overlapping course objectives before and after program completion. Table 1 outlines the specific course objectives that map to the codes in this figure (e.g. C1-O1 = Core 1, Objective 1 and E2-O1 = Elective 2, Objective 1). A single asterisk (*) indicates courses where an additional e-learning module was added as pre-learning prior to holding the course for Cohort 2 (C2).
