
Evaluating the impact of a CTSA program from 2008 to 2021 through bibliometrics, social network analysis, and altmetrics

Published online by Cambridge University Press:  11 January 2023

Fei Yu*
Affiliation:
Health Sciences Library, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA
Tanha Patel
Affiliation:
North Carolina Translational and Clinical Institute, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA School of Medicine, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA
Andrea Carnegie
Affiliation:
North Carolina Translational and Clinical Institute, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA School of Medicine, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA
Gaurav Dave
Affiliation:
North Carolina Translational and Clinical Institute, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA School of Medicine, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA
*
Address for correspondence: F. Yu, PhD, Health Sciences Library, University of North Carolina-Chapel Hill, Chapel Hill, NC 27599, USA. Email: [email protected]

Abstract

Introduction:

We evaluate a CTSA program hub by applying bibliometrics, social network analysis (SNA), and altmetrics, and we examine the changes in research productivity, citation impact, research collaboration, and CTSA-supported research topics since our pilot study in 2017.

Methods:

The sampled data included North Carolina Translational and Clinical Science Institute (NC TraCS)-supported publications produced between September 2008 and March 2021. We applied measures and metrics from bibliometrics, SNA, and altmetrics to the dataset. In addition, we analyzed research topics and correlations between different metrics.

Results:

A total of 1154 NC TraCS-supported publications accrued over 53,560 citations by April 2021. The average cites per year and the mean relative citation ratio (RCR) of these publications improved from 33 and 2.26 in 2017 to 48 and 2.58 in 2021. The number of UNC units involved in the most published authors’ collaboration network increased from 7 (2017) to 10 (2021). NC TraCS-supported co-authorship involved 61 NC organizations. PlumX metrics identified the articles with the highest altmetrics scores. About 96% of NC TraCS-supported publications scored above the average SciVal Topic Prominence Percentile (50th percentile); the average approximate potential to translate of the included publications was 54.2%; and 177 publications addressed health disparity issues. Bibliometric measures (e.g., citation counts, RCR) and PlumX metrics (i.e., Citations, Captures, and Social Media) are positively correlated (p < .05).

Conclusion:

Bibliometrics, SNA, and altmetrics offer distinctive but related perspectives to examine CTSA research performance and longitudinal growth, especially at the individual program hub level. These perspectives can help CTSAs build program foci.

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike licence (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the same Creative Commons licence is used to distribute the re-used or adapted article and the original article is properly cited. The written permission of Cambridge University Press must be obtained prior to any commercial use.
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of The Association for Clinical and Translational Science

Introduction

Clinical and translational science (CTS) encompasses multistage scientific investigation, from fundamental discoveries in the laboratory, clinic, and community to interventions translated into new treatments and approaches for improving the health of individuals and populations [1]. The National Center for Advancing Translational Sciences (NCATS) in the United States initiated the Clinical and Translational Science Awards (CTSA) program in 2006 and has invested about half a billion dollars annually in a national network of more than 50 medical research institutions (also called “hubs”) [2]. CTSA hubs vary in size, goals, priorities, services, and geographic location, but all aim to accelerate the translation of scientific discoveries into improved patient care.

Evaluating a CTSA program hub is essential since hubs expend substantial public funding annually [3] and spend considerable time and resources building the CTS pipeline. However, it is complicated and challenging to demonstrate that a CTSA hub is “well implemented, efficiently managed, adequately resourced and demonstrably effective,” as stated in CTSA-specific evaluation guidelines [Reference Trochim, Rubio and Thomas4]. Therefore, CTSA evaluators have explored an array of feasible evaluation approaches, measures, and models, including common metrics [Reference Rubio, Blank and Dozier5], logic models [6], a return-on-investment model [Reference Grazier, Trochim, Dilts and Kirk7], Developmental Evaluation and the Context Input Process Product Model [Reference Zhang, Zeller and Griffith8], the payback framework [Reference Rollins, Llewellyn, Ngaiza, Nehl, Carter and Sands9], and a mixed-methods approach combining logic models and expert panel evaluation [Reference Wooten, Rose, Ostir, Calhoun, Ameredes and Brasier10]. Bibliometrics [Reference Yu, Van and Patel11–Reference Sayavedra, Hogle and Moberg15] and social network analysis (SNA) [Reference Bian, Xie, Topaloglu, Hudson, Eswaran and Hogan16–Reference Vacca, McCarty, Conlon and Nelson18] have also been explored for their feasibility to chart the research outcomes, collaboration, and impact of CTSA-supported activities.

Bibliometrics has been widely used to outline the research landscape and to reveal the direct outcomes and impact of scientific investigation through quantitative analysis of a chosen group of publications. In biomedical and health sciences, bibliometrics is a core method for evaluating research impact [Reference Milat, Bauman and Redman19]. The National Institutes of Health (NIH) requires each CTSA program hub to track and report its annual publication count. A CTSA consortium-led evaluation workgroup also identified shared interest in using publication analysis to assist in assessing the annual programs of individual CTSAs [Reference Frechtling, Raue, Michie, Miyaoka and Spiegelman20]. Recent evaluation studies by CTSAs have also confirmed the validity and feasibility of bibliometrics as a critical approach to evaluating CTSA-supported translational research [Reference Llewellyn, Carter, DiazGranados, Pelfrey, Rollins and Nehl12,Reference Schneider, Kane and Rainwater14]. For example, bibliometrics has been applied to assess (1) an individual CTSA program hub [Reference Yu, Van and Patel11], a group of CTSA program hubs [Reference Schneider, Kane and Rainwater14], and the overall CTSA consortium or a specific program across CTSAs [Reference Llewellyn, Carter, DiazGranados, Pelfrey, Rollins and Nehl12,Reference Llewellyn, Carter, Rollins and Nehl13,Reference Sayavedra, Hogle and Moberg15,Reference Qua, Yu, Patel, Dave, Cornelius and Pelfrey21]; (2) research productivity and citation impact using both basic publication/citation counts and advanced citation impact indicators (e.g., iCite’s Relative Citation Ratio, Elsevier’s Field-Weighted Citation Impact, and Web of Science’s Category Normalized Citation Impact) [Reference Yu, Van and Patel11–Reference Sayavedra, Hogle and Moberg15,Reference Qua, Yu, Patel, Dave, Cornelius and Pelfrey21]; (3) interdisciplinary or inter-CTSA collaborations [Reference Yu, Van and Patel11,Reference Llewellyn, Carter, DiazGranados, Pelfrey, Rollins and Nehl12]; and (4) research areas aligned with the translational spectrum [Reference Yu, Van and Patel11,Reference Llewellyn, Carter, DiazGranados, Pelfrey, Rollins and Nehl12].

Another CTS evaluation approach, SNA, focuses on the patterns of interaction between social entities [Reference Bian, Xie and Hudson22]. It is particularly suited to understanding the multidisciplinary collaborations and team science essential for CTS’s success [Reference Nagarajan, Lowery and Hogan23,Reference Luke, Carothers and Dhand24]. Several CTSA programs have produced use cases of applying SNA to evaluate the impact of the translational teams they support. These programs used grants, publications, and surveys to measure and visualize temporal evolution and cross-discipline collaboration patterns. For example, SNA was applied to grant data to compare biomedical research collaborations before and after CTSA awards [Reference Nagarajan, Peterson, Lowe, Wyatt, Tracy and Kern17] or to identify “influential” researchers and “potential new collaborations” [Reference Bian, Xie, Topaloglu, Hudson, Eswaran and Hogan16]. Publications were either used as the sole data source to extend SNA to bibliometric network analysis by examining co-authorship [Reference Yu, Van and Patel11,Reference Sorensen, Seary and Riopelle25] or combined with grants to explore CTSA-supported research collaboration patterns [Reference Vacca, McCarty, Conlon and Nelson18,Reference Luke, Carothers and Dhand24]. In addition, a few CTSA hubs used survey data to investigate collaboration networks at the macro- (i.e., entire network) and meso-levels (e.g., across departments) [Reference Vacca, McCarty, Conlon and Nelson18,Reference Dozier, Martina and O’Dell26] or to design a program that creates collaborations between previously unconnected researchers [Reference Vacca, McCarty, Conlon and Nelson18]. Therefore, CTSA evaluators have experience adopting both SNA and bibliometrics to understand the scale and scope of the teamwork they support, identify missing connections, connect researchers, and improve team effectiveness.

Finally, with increasing social media usage in scholarly communication, research enterprise stakeholders (e.g., sponsors, researchers, and evaluators) have pressed for alternative metrics, also known as altmetrics, to improve the evaluation of research output [27]. Altmetrics complements citation-based metrics for research impact evaluation by tracking immediate online attention, such as usage (e.g., downloads, views), mentions (e.g., news, blogs, Wikipedia), and social media (e.g., Twitter, Facebook) [Reference Akers28]. Researchers have extensively applied altmetrics to measure or identify the social impact of health sciences research [Reference Giustini, Axelrod, Lucas and Schroeder29,Reference Punia, Aggarwal, Honomichl and Rayi30]. In addition, quite a few studies have explored the correlation between traditional citation measures (e.g., citation counts) and altmetrics [Reference Giustini, Axelrod, Lucas and Schroeder29–Reference Luc, Archer and Arora33]. Two new CTSA evaluation studies reported using both bibliometrics and altmetrics to assess the short- and long-term impact of translational research [Reference Llewellyn, Weber, Fitzpatrick and Nehl34] and to explore the association between those measures [Reference Llewellyn, Weber and Nehl35], further validating the potential of using both bibliometric and altmetric measures for CTSA evaluations.

Therefore, building on our previous study [Reference Yu, Van and Patel11], which assessed a bibliometrics approach for publications citing the North Carolina Translational and Clinical Science Institute (NC TraCS), the CTSA hub for the University of North Carolina at Chapel Hill (UNC-CH), from 2008 to April 2017, this study took a mixed-metrics approach by applying bibliometrics, SNA, and altmetrics to an expanded publication year range (i.e., 2008–March 2021). In particular, we provide insights into the potential influence of several programmatic changes at NC TraCS, including the creation of two new programs (Inclusive Science and Team Science), the inclusion of a required Community and Participant Engagement Plan in all our pilot grant applications, targeted pilot grant Requests for Applications (RFAs) focused on addressing health equity, and the creation of a formal partnership with North Carolina Agricultural and Technical State University (NC A&T), the largest Historically Black College and University (HBCU) in North Carolina. During this period (2017–2021), the world, the nation, and UNC-CH were also impacted by and responded to the COVID-19 pandemic, resulting in potential changes in our supported research output.

In addition to the metrics and measures in our 2017 pilot study, this study examined two new bibliometric topic metrics (i.e., Topic Prominence Percentile and Approximate Potential to Translate). In particular, we investigated NC TraCS-supported research topics pertinent to health disparities. We explored the following research questions (RQ):

RQ1: How have the research productivity and impact of the NC TraCS-supported CTS enterprise at UNC-CH changed since 2017?

RQ2: How has the research collaboration catalyzed by NC TraCS-supported research changed since 2017?

RQ3: (a) How are NC TraCS-supported research topics ranked by prominence and translational potential, and (b) how do these topics address health disparities?

RQ4: Is there any relationship between bibliometric, altmetric, and research topic measures?

Methods

Data Sample

We included NC TraCS-supported publications from September 1, 2008, through the most recent NIH annual progress report period ending March 23, 2021, resulting in a total of 1154 eligible publications. We define NC TraCS-supported publications as those in which the authors acknowledged and cited the NC TraCS grant as their research support. The bibliographic records of the 1154 publications were retrieved from the NC TraCS account at the PubMed/National Center for Biotechnology Information (NCBI), downloaded from PubMed in Medline format, and used as the master data file for analysis.
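To illustrate how a Medline-format export can serve as a master data file, the sketch below parses the records with Biopython and writes the fields used later for Scopus matching and topic screening. The file names and column choices are our assumptions for illustration, not the exact pipeline used in this study.

```python
# A minimal sketch, assuming the PubMed export is saved as "nc_tracs_pubs.medline"
# (hypothetical file name). PMID, TI, AU, AID, and MH are standard Medline field
# tags; Biopython returns some of them as strings and others as lists.
import csv
from Bio import Medline  # pip install biopython

def as_list(value):
    """Normalize a Medline field to a list regardless of how it was parsed."""
    if value is None:
        return []
    return value if isinstance(value, list) else [value]

with open("nc_tracs_pubs.medline") as handle, \
        open("master_dataset.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["pmid", "title", "authors", "doi", "mesh_terms"])
    for rec in Medline.parse(handle):
        # Article IDs (AID) usually include the DOI flagged with "[doi]"
        doi = next((a.split()[0] for a in as_list(rec.get("AID")) if "[doi]" in a), "")
        writer.writerow([
            rec.get("PMID", ""),
            rec.get("TI", ""),
            "; ".join(as_list(rec.get("AU"))),
            doi,
            "; ".join(as_list(rec.get("MH"))),
        ])
```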

Data Tools

This study used the following tools to collect and analyze publication data, consistent with the tools used in the 2017 study. In addition, we adopted a few bibliometric and topic measures recently made available by these tools.

Elsevier Scopus covers a broader spectrum of research publications across disciplines than its counterpart, Web of Science [Reference Pranckutė36]. UNC-CH maintains an active subscription to Scopus, giving us access to citation impact measures (e.g., citation counts and comparative citation impact ratios), the SciVal Topic Prominence Percentile (STPP) [37], and PlumX metrics [38]. The citation impact data in this study were gathered in the same manner as in the 2017 study, by searching and matching the citation fields (i.e., PMID, DOI, and title) of the PubMed-exported publication records in Scopus. We added STPP and PlumX metrics to this study (defined below), making this the first CTSA evaluation to utilize these new Scopus metrics and data sources. While the STPP of each matched Scopus citation was collected through Web Scraper [39], the PlumX metrics were collected via the Scopus API [40].
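As a hedged illustration of the matching step, the sketch below queries the Elsevier Scopus Search API for a single DOI and reads the cited-by count from the response. The API key placeholder is hypothetical, and the response field names should be verified against Elsevier's current API documentation and the institutional subscription.

```python
# A sketch only: query the Scopus Search API by DOI and return the citation count.
# Rate limits, quotas, and exact response fields depend on the subscription.
import requests

API_KEY = "YOUR-ELSEVIER-API-KEY"  # placeholder; issued to the subscribing institution
SEARCH_URL = "https://api.elsevier.com/content/search/scopus"

def scopus_citation_count(doi: str):
    resp = requests.get(
        SEARCH_URL,
        params={"query": f"DOI({doi})"},
        headers={"X-ELS-APIKey": API_KEY, "Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    entries = resp.json().get("search-results", {}).get("entry", [])
    if not entries or "error" in entries[0]:
        return None  # no Scopus match for this DOI; fall back to PMID or title matching
    return int(entries[0].get("citedby-count", 0))

# Example: scopus_citation_count("10.1017/cts.2020.29")
```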

NIH iCite [41]: Since the NIH Office of Portfolio Analysis developed and validated the Relative Citation Ratio (RCR) [Reference Hutchins, Yuan, Anderson and Santangelo42] as an article-level comparative citation impact indicator, RCR has been frequently used by evaluators and researchers to assess the impact of research publications supported by public funds, including CTSAs [Reference Yu, Van and Patel11,Reference Llewellyn, Carter, Rollins and Nehl13,Reference Sayavedra, Hogle and Moberg15]. iCite has also designed and launched a Translation module and a new publication-level metric, the Approximate Potential to Translate (APT). While the Translation module lets us compare how close two analysis groups of articles are to human clinical application, the APT score predicts a publication’s future translational progress in biomedical research.
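For illustration, the sketch below retrieves RCR and APT values from the public iCite API in batches of PMIDs. The batch size and the response field names (e.g., relative_citation_ratio, apt) are assumptions that should be checked against the current iCite API documentation.

```python
# A sketch of batch retrieval from the NIH iCite API (https://icite.od.nih.gov/api).
# Field names below are assumed from the API's documented output.
import requests

def icite_metrics(pmids):
    results = []
    for start in range(0, len(pmids), 200):  # assumed safe batch size
        batch = pmids[start:start + 200]
        resp = requests.get(
            "https://icite.od.nih.gov/api/pubs",
            params={"pmids": ",".join(batch)},
            timeout=60,
        )
        resp.raise_for_status()
        for rec in resp.json().get("data", []):
            results.append({
                "pmid": rec.get("pmid"),
                "rcr": rec.get("relative_citation_ratio"),
                "apt": rec.get("apt"),
                "citations": rec.get("citation_count"),
            })
    return results
```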

VOSviewer (Version 1.6.16) [Reference Van Eck and Waltman43]: As a specialized bibliometric and network analysis application with excellent usability, VOSviewer has been widely used to analyze and visualize coauthorship networks across disciplines [44], including in our pilot study.
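Co-authorship maps like those produced in VOSviewer are built from the author lists of the sampled records. The sketch below is not part of the VOSviewer workflow (the networks in this study were built and visualized in VOSviewer itself); it only illustrates the underlying edge construction with networkx from the hypothetical master_dataset.csv above.

```python
# Build a weighted co-authorship graph: nodes are authors, edge weights count
# coauthored publications. Assumes the "authors" column is semicolon-separated.
import csv
from itertools import combinations
import networkx as nx

G = nx.Graph()
with open("master_dataset.csv") as f:
    for row in csv.DictReader(f):
        authors = sorted({a.strip() for a in row["authors"].split(";") if a.strip()})
        for a, b in combinations(authors, 2):
            if G.has_edge(a, b):
                G[a][b]["weight"] += 1
            else:
                G.add_edge(a, b, weight=1)

# The highest weighted degrees approximate the most collaborative authors
top = sorted(G.degree(weight="weight"), key=lambda x: x[1], reverse=True)[:10]
```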

Data Measures

  • Bibliometrics

We continued to use validated bibliometric measures from our previous study and other CTSA bibliometric evaluation reports, including publication counts, citation counts, average cites per year, and field- and time-normalized comparative citation ratios at the article level, such as the Field-Weighted Citation Impact (FWCI), Citation Benchmarking (CB), and Relative Citation Ratio (RCR) [Reference Yu, Van and Patel11,Reference Llewellyn, Carter, Rollins and Nehl13–Reference Sayavedra, Hogle and Moberg15,Reference Frechtling, Raue, Michie, Miyaoka and Spiegelman20] (Table 1).

Table 1. Data measures, categories, metrics, sources, and analysis tools

*New measures adopted in this study compared to the measures in the 2017 pilot study; TraCS, Translational and Clinical Science Institute; CTSA, Clinical and Translational Science Award; FWCI, Field-Weighted Citation Impact; CB, Citation Benchmarking; RCR, Relative Citation Ratio; NC, North Carolina; HBCU, Historically Black Colleges & Universities; STPP, SciVal Topic Prominence Percentile; APT, Approximate Potential to Translate.

  • Collaboration network analysis

The collaboration measures included both intra-organization collaboration (e.g., the UNC unit collaboration network) and inter-organization collaboration (i.e., inter-NC CTSA collaboration, inter-UNC system collaboration, and collaborations between NC TraCS and HBCUs). Our pilot study found that approximately half of the NC TraCS-supported publications were generated in collaboration with researchers at other CTSA hub institutions [Reference Yu, Van and Patel11]. Therefore, this study explored more granular collaborations between NC TraCS and local institutions in North Carolina and HBCUs across the nation.

  • Altmetrics

Altmetric.com and PlumX are the two major commercial altmetrics data providers [Reference Ortega45]. We chose the PlumX metrics available in Scopus as additional measures for this study. Five comprehensive article-level PlumX metrics were exported via the Scopus API on March 30, 2021: Citations (e.g., clinical citations, patent citations), Usage (e.g., abstract views, downloads), Captures (e.g., bookmarks, reference manager saves), Mentions (e.g., blog posts, news, or Wikipedia mentions), and Social Media (e.g., tweets, Facebook shares).
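A hedged sketch of that export step is shown below. The PlumX Metrics API is available to Scopus subscribers, but the endpoint path, authentication header, and response layout shown here are assumptions to be confirmed in Elsevier's developer documentation.

```python
# A sketch only: pull article-level PlumX metric totals for one DOI.
# The endpoint path and the "count_categories" key are placeholders/assumptions.
import requests

API_KEY = "YOUR-ELSEVIER-API-KEY"  # placeholder institutional key

def plumx_metrics(doi: str):
    resp = requests.get(
        f"https://api.elsevier.com/analytics/plumx/doi/{doi}",  # assumed endpoint
        headers={"X-ELS-APIKey": API_KEY, "Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()
    # Flatten whatever per-category totals the response exposes
    # (e.g., Citations, Usage, Captures, Mentions, Social Media).
    return {cat.get("name"): cat.get("total")
            for cat in data.get("count_categories", [])}
```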

  • Topic measures

The topic measures in this study employed two new metrics: STPP and APT. Elsevier developed STPP, an article-level metric that shows the current momentum or visibility of the topic in which a publication is clustered [37]. It is calculated by weighting three metrics for the publications clustered in a topic: citation count, Scopus view count, and average CiteScore (the Scopus journal impact metric). A high STPP means the topic has high momentum, is likely to be well funded, and thus tends to have higher grant success rates. The APT score is generated by a machine learning algorithm that considers the citations a publication receives from clinical articles and its citation network [41]. In addition to an average APT score for predicting a topic’s translational potential, the Translation module in iCite reports the average number of articles in an analysis group falling into three categories (i.e., Human, Animal, and Molecular/Cellular Biology) classified by Medical Subject Heading (MeSH) terms. Compared with the topic clustering measure in our 2017 pilot study, which extracted key terms from titles and abstracts and characterized the translational phases qualitatively, we believe the two new topic metrics (STPP and APT) can capture the impact and translational themes of supported publications more quantitatively.

In addition, this study focused on the health disparities addressed by NC TraCS-supported publications. Words, terms, and their variations related to health disparity and inequality were searched against the citation fields (i.e., title, abstract, author keywords, and MeSH terms) of our master dataset (N = 1154), including health disparity (disparities), health equity/inequity, rural health/communities/hospitals, healthcare accessibility, African Americans, Hispanic, Latino/Latinos, race/racial, racism, ethnicity, underserved, minority, people of color, poverty, socioeconomic factors, and population health. Retrieved publications were manually screened for relevance before topic analysis.
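A minimal sketch of such a keyword screen is shown below, assuming the master dataset carries title, abstract, author keyword, and MeSH columns. The term list is abridged for illustration, and matched records still require the manual relevance screening described above.

```python
# Flag candidate health-disparity publications by regular-expression matching
# across the concatenated citation fields. Column names are assumptions.
import csv
import re

TERMS = [
    r"health disparit(?:y|ies)", r"health (?:in)?equit", r"rural (?:health|communit|hospital)",
    r"healthcare accessibility", r"african american", r"hispanic", r"latin[oa]s?",
    r"\brac(?:e|ial|ism)\b", r"ethnicity", r"underserved", r"minority",
    r"people of color", r"poverty", r"socioeconomic factors", r"population health",
]
PATTERN = re.compile("|".join(TERMS), re.IGNORECASE)

candidates = []
with open("master_dataset.csv") as f:
    for row in csv.DictReader(f):
        text = " ".join(row.get(k, "") for k in ("title", "abstract", "keywords", "mesh_terms"))
        if PATTERN.search(text):
            candidates.append(row["pmid"])
```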

  • Correlation measure

The correlations between bibliometric measures (i.e., citation counts, FWCI, CB, RCR), altmetrics (i.e., PlumX-Citations, Captures, Usage, Mentions, and Social Media), and the topic measure (i.e., STPP) were assessed with Spearman rho correlation coefficients [Reference Giustini, Axelrod, Lucas and Schroeder29,Reference Punia, Aggarwal, Honomichl and Rayi30]. This study is the first CTSA evaluation to formally investigate these correlations using the data provided by Scopus, contributing to the growing literature on this topic [Reference Richardson, Park, Echternacht and Bell46–Reference Llewellyn and Nehl49].
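For illustration only (the correlations in this study were computed in SPSS), the sketch below computes Spearman's rho for a few metric pairs with SciPy, assuming a hypothetical per-publication metrics file.

```python
# Spearman rank correlations between bibliometric, altmetric, and topic measures.
# File and column names are assumptions for illustration.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("metrics_per_publication.csv")  # one row per article
pairs = [("rcr", "plumx_citations"), ("fwci", "plumx_social_media"), ("stpp", "cites")]
for x, y in pairs:
    sub = df[[x, y]].dropna()
    rho, p = spearmanr(sub[x], sub[y])
    print(f"{x} vs {y}: rho = {rho:.2f}, p = {p:.3f}")
```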

Data Analysis

This study compared the data measures obtained in 2017 [Reference Yu, Van and Patel11] and 2021. We used Microsoft Excel and SPSS for quantitative statistics and testing. First, we generated descriptive statistics and visualizations to demonstrate the longitudinal change in NC TraCS-supported publications on all three groups of measures (bibliometric, collaboration, and topic) from 2008 to 2021. Second, we conducted paired samples t-tests to detect any statistically significant change in citation impact (i.e., FWCI, CB) between the analyses of our 2017 pilot and 2021 studies. Third, we calculated Spearman rho correlation coefficients in SPSS to test the statistical significance of the correlations. Fourth, we constructed a co-occurrence Medical Subject Headings (MeSH) term network map for the health disparity topic by identifying the most frequently occurring MeSH terms.
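To make the second and fourth steps concrete, the sketch below pairs each article's FWCI from the two measurement points for a paired t-test and counts MeSH term co-occurrences in the health-disparity subset (mirroring the >5 co-occurrence threshold of Fig. 3). File and column names are hypothetical.

```python
# Step 2 (paired t-test) and step 4 (MeSH co-occurrence counts), as sketches.
import csv
from collections import Counter
from itertools import combinations

import pandas as pd
from scipy.stats import ttest_rel

# Paired t-test on FWCI measured in 2017 vs. 2021 for the same articles
fwci = pd.read_csv("fwci_2017_vs_2021.csv").dropna()  # columns: pmid, fwci_2017, fwci_2021
t_stat, p_value = ttest_rel(fwci["fwci_2017"], fwci["fwci_2021"])

# MeSH term co-occurrence counts across the health-disparity subset
pair_counts = Counter()
with open("health_disparity_records.csv") as f:
    for row in csv.DictReader(f):
        terms = sorted({m.strip() for m in row["mesh_terms"].split(";") if m.strip()})
        pair_counts.update(combinations(terms, 2))

# Keep term pairs co-occurring more than 5 times as network edges
edges = [(a, b, n) for (a, b), n in pair_counts.items() if n > 5]
```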

Results

Bibliometric Measures

  • Comparison of Research productivity and Citation impact

The average number of NC TraCS-supported publications increased from 82 per year (identified in the 2017 pilot study) to 87 per year in this study (Table 2). We excluded the year 2021 because it represents only partial output (January to March). Since 2017, both research productivity and citation counts have continued to grow annually (Supplementary Figure 1). In addition, the total citation count of NC TraCS-supported articles increased from 24,010 by April 20, 2017 [Reference Yu, Van and Patel11] to 53,560 by April 27, 2021. The average citations per NC TraCS-supported publication also improved, from 33 in 2017 to 48 in 2021.

Fig. 1. The STPP distribution of NC TraCS-supported publications (September 2008–March 2021) (N = 1,115) (Note: NC TraCS = North Carolina Translational and Clinical Science Institute; STPP = SciVal Topic Prominence Percentile; 90–99 percentile (N = 730); 80–89 percentile (N = 217); 70–79 percentile (N = 66); 50–69 percentile (N = 61); 0–49 percentile (N = 36); no data (N = 5)).

Table 2. The comparison of research productivity and citation impact (2017 vs. 2021)

*For both the pilot study and the current study, when calculating the average number of publications per year and the average cites per year, we excluded the partial output of a year (i.e., January to March). For example, in the pilot study, the scholarly output and citation counts of the year 2017 were excluded. RCR, Relative Citation Ratio; SEM, standard error of the mean.

The paired t-test shows no significant difference in FWCI scores between 2017 (Mean = 3.89, Standard Deviation (SD) = 14.48) and 2021 (M = 4.01, SD = 19.93) (t(639) = -0.432, P = .666). However, there was a significant difference in CB scores between 2017 (M = 74.81, SD = 21.10) and 2021 (M = 77.27, SD = 17.35) (t(604) = -5.850, P < .001). Although publication productivity increased 1.5-fold since our pilot study, 86.57% of these publications were still above the average CB (50th percentile) in 2021 (Supplementary Figure 2), slightly more than the 83% reported in the pilot study.

Fig. 2. High article density of NC TraCS-supported research on human health (N = 1,154) (Note: NC TraCS, North Carolina Translational and Clinical Science Institute; the visualization was generated by the iCite Translation module).

Collaboration Measures

  • UNC unit collaboration

Compared with our pilot study, NC TraCS-supported research collaboration has grown in the total number of supported authors, the number of most published authors, the total number of UNC units involved, and the average number of coauthors per most published author. Notably, the collaboration network of the most published UNC authors (each with >5 publications) across internal units (Supplementary Figure 3) shows two additional UNC units (i.e., the UNC School of Social Work and the Renaissance Computing Institute) compared with our pilot study in 2017. However, during the latest 30-month period (09/01/2018–02/28/2021), the average number of coauthors per most published author decreased to 4, from 6 in the previous 30-month period examined (03/01/2016–08/31/2018) (Table 3).

Table 3. The comparison of research collaboration (2017 vs. 2021)

Note: UNC, University of North Carolina at Chapel Hill.

  • NC TraCS-supported local collaboration

In producing the 1154 publications, NC TraCS-supported researchers coauthored with local researchers from 61 organizations in North Carolina, including the two other CTSA hubs in the state (i.e., Duke University and Wake Forest University) and 8 of the 16 schools in the UNC system (e.g., North Carolina State University, East Carolina University, UNC-Charlotte, UNC-Greensboro) (Table 4).

Table 4. The comparison of research collaboration in North Carolina (2017 pilot vs. 2021) (numbers below represent coauthored publications)

Note: NC, North Carolina; UNC, University of North Carolina at Chapel Hill; NC A&T University, North Carolina Agricultural and Technical State University.

  • NC TraCS-HBCU collaboration

NC TraCS-supported researchers also collaborated with researchers at four HBCUs and one CTSA in which an HBCU participates (i.e., the Georgetown-Howard Universities Center for Clinical & Translational Science (GHUCCTS)) in the United States. These collaborations produced five coauthored publications with North Carolina Central University, two each with North Carolina A&T State University and Howard University, and one with Meharry Medical College (Supplementary Table 1).

  • Altmetrics measures

The PlumX metric scores of the 1154 NC TraCS-supported publications are summarized in Supplementary Table 2, and the publications with the highest PlumX scores are listed in Supplementary Table 3. Notably, among 52,269 altmetric citations, there are 287 clinical citations and 182 patent citations. The 619,722 total usages (e.g., clicks, views, downloads) include 322,158 abstract views and 261,976 full-text views. Of 108,360 total captures, 74,334 are Mendeley saves. In addition, these publications were mentioned 118 times in blogs and 1068 times in news; they appeared on Facebook 16,481 times and were tweeted 8,908 times.

Topic Measures

  • SciVal Topic Prominence Percentile (STPP)

Ninety-six percent of NC TraCS-supported publications (N = 1,074) have an above-average STPP (>50th percentile), and 64% of the total publications (N = 730) fall in the 90–99 percentile range (Fig. 1).

  • Approximate Potential to Translate (APT)

The average APT that NIH iCite generated for the 1154 NC TraCS-supported publications is 54.2%, meaning that, on average, the likelihood that an NC TraCS-supported article will be cited by a clinical article is 54.2%. In addition, the average “Human” score for NC TraCS-supported articles is 0.80, the average “Animal” score is 0.06, and the average “Mol/Cell” score is 0.11. Overall, NC TraCS-supported research is oriented toward humans and human health (Fig. 2). According to the Translation module, 524 papers have already been cited by a clinical article.

  • Focal topics

A total of 177 NC TraCS-supported publications addressed health disparity issues. Fig. 3 presents a co-occurrence network map of MeSH terms associated with these publication records, revealing six characteristics of these studies. 1) Regarding populations and demographics, 127 studies focused on females, 108 on males, and 58 on both; middle-aged (79 publications) and adult (75) populations were studied more than other age groups, such as adolescents (30) or those aged 80 and over (23). These studies also focused on populations in North Carolina (30), African Americans (31), the continental African ancestry group (6), and rural populations (14). 2) Regarding diseases and health symptoms, HIV infection was the disease most often associated with health disparities (18), followed by breast neoplasms (9). 3) Regarding treatment, 14 studies reported treatment outcomes, 6 investigated antihypertensive agents, and 5 studied highly active antiretroviral therapy. 4) Regarding research methods, cross-sectional studies were employed most frequently (24), followed by surveys/questionnaires (22) and cohort studies (17). 5) Risk factors (16) and socioeconomic factors (13) were the two most frequently examined variables. 6) Regarding patient-healthcare interaction, NC TraCS-supported research covered a range of topics, including health knowledge, attitudes, and practice (14), patient education (11), healthcare disparities (9), and physician–patient relations (9).

Fig. 3. MeSH term co-occurrence network map (N = 177 publications; 72 MeSH terms with co-occurrence >5 times) (Note: MeSH, Medical Subject Heading).

Correlation Measure

Spearman’s rho testing shows that (1) FWCI, Cites (i.e., Scopus citation counts), CB, RCR, PlumX-Citations, PlumX-Captures, and PlumX-Social Media counts are all positively correlated with one another (p < .05); and (2) STPP is positively correlated with FWCI, Cites, CB, RCR, PlumX-Citations, PlumX-Mentions, and PlumX-Social Media. However, the correlations are not statistically significant between STPP and PlumX-Captures (p > .05), between STPP and PlumX-Usage (p > .05), or between PlumX-Usage and PlumX-Mentions (p > .05).

Discussion

Bibliometrics

The results from the bibliometric measures show that NC TraCS-supported publications continued to grow in productivity and citation impact after our pilot study in 2017. The total number of NC TraCS-supported publications in 2021 is about 1.5 times the number measured in 2017, boosting the average number of publications per year from 82 (measured in 2017) to 87 currently. Regarding citation influence, the average cites per year and the RCR mean improved from 33 and 2.26 in 2017 to 48 and 2.58 in 2021. Notably, we identified a statistically significant difference in the CB (a time-normalized citation impact measure) of NC TraCS-supported articles between 2017 and 2021, indicating that these articles achieved higher citation benchmarking in 2021. In addition, slightly more publications are above the average CB in 2021 (86.57%) than in 2017 (83%), showing that most NC TraCS-supported papers have been cited more than average papers from similar times and fields worldwide.

Collaboration Network

Our collaboration measures illustrate the enlarged scale of research collaboration that NC TraCS-supported research has continued to catalyze since 2017. The number of supported authors doubled in 4 years, and the number of UNC units involved in the most published authors’ collaboration network increased from 7 (2017) to 10 (2021). Additionally, the most published authors had been working with progressively more researchers per 30-month examination period, reaching an average of six coauthors from March 1, 2016, to August 31, 2018. However, this growing trend was interrupted in the last examination period (September 1, 2018, to February 28, 2021), which could be ascribed to the impact of the COVID-19 pandemic and warrants future study.

In particular, NC TraCS-supported research has reached out to local and national HBCUs. For example, the number of supported publications coauthored with NC Central University and NC A&T University has doubled since 2017, and researchers at two more HBCUs (Howard University and Meharry Medical College) participated in NC TraCS-supported research projects. The collaboration with HBCUs can be ascribed to the formalized partnership between UNC-CH and NC A&T University and to the new Inclusive Science program, which places particular emphasis on groups that have been historically underrepresented in research or that experience significant health disparities in NC.

Altmetrics

In 2017, we could not conduct an altmetrics analysis because many included articles did not have sufficient PlumX data. By 2021, however, all included articles had received altmetric citations. We observed a variety of altmetric citation types (e.g., clinical citations, patent citations, usages, blog and news mentions, tweets, and Facebook shares) and identified a few standout papers by PlumX measures, highlighting their clinical and social impact (Supplementary Table 3).

Topics

The two new topic measures we adopted enabled us to analyze the impact and prominence of supported research topics quantitatively. Measured by STPP, 64% of NC TraCS-supported publications are in the 90–99 prominence percentile range (Fig. 1), indicating extremely high momentum or visibility in the scientific field worldwide. Measured by APT, on average, slightly more than half of the publications are likely to be cited by clinical articles, directly contributing to improving human health. Consistent with the APT, the iCite Translation module also shows that NC TraCS-supported publications cover all three translational categories (i.e., Human, Animal, Molecular/Cellular) with a concentration on “Human.” Therefore, we can affirm NC TraCS’s continuing effort in supporting the mission of NCATS.

Furthermore, about 15% of NC TraCS-supported publications promote or support health equity and community health by focusing on minority populations (e.g., African Americans, Hispanic Americans), people of color, underserved communities, and patients in rural areas. Notably, the MeSH terms “North Carolina” and “Community-based participatory research” showed high co-occurrence, reflecting local stakeholder and community engagement. These focal topics are highly consistent with NC TraCS’s modification of its pilot grant applications to require a community and participant engagement plan in every application.

Correlations

Our correlation testing results are consistent with previous studies showing that traditional citation counts are positively correlated with altmetrics scores (e.g., Altmetric, PlumX metrics) [Reference Maggio, Leroux, Meyer and Artino31–Reference Luc, Archer and Arora33]. However, this study went further by testing and identifying positive correlations between comparative citation ratios (e.g., FWCI, RCR) and PlumX metrics, including Citations, Mentions, Captures, and Social Media. In addition, the topic measure, STPP, is positively correlated with the citation measures and with PlumX metrics (i.e., Citations, Mentions, and Social Media). This is the first CTSA evaluation to use Scopus measures and data sources to explore the correlations (1) between advanced citation measures and altmetrics and (2) between a new topic measure and both citation and altmetric measures.

In 2017, NC TraCS submitted a CTSA application in response to a new program announcement from NCATS that emphasized priorities on increasing inclusivity and health equity and on facilitating team science. Our 2017 study provided evidence for advancing our CTSA programming in several areas, and NC TraCS responded with a set of new programs to address these priorities, including the Inclusive Science program, Team Science, the required Community and Participant Engagement Plan, and a formalized partnership with the largest local HBCU. The progress identified since 2017 indicates that these new programs are effective. During the past 4 years, new bibliometric measures, evaluation indicators, and applications have been developed and introduced to the CTSA community, adding perspectives from which to examine CTSA research performance, especially at the individual program hub level. It is important to keep tracking and measuring the success and impact of CTSA-supported translational science with this growing set of metrics and tools.

Our findings have several implications. First, the inclusion of collaboration network analysis allowed us to see where the growth in volume is happening. For example, we are seeing more UNC units represented in NC TraCS-supported publications. As evaluators, we can then assess whether the new UNC units are the same units where the programs have focused their efforts. Similarly, we have partnered with NC A&T University over the past 4 years to increase its research productivity and NIH funding portfolio by providing direct services. We expect an increase in the number of publications with coauthors from NC A&T University over time.

Secondly, researchers and CTSA institutions are increasingly using social media (e.g., Twitter) to promote and broadly disseminate their research outputs so that more people can benefit from publicly funded scientific discovery. It is therefore becoming important to consider altmetrics as a complement to citation-based measurement of research impact.

Thirdly, with the CTSA-wide focus on increasing the translation of findings, it is vital for CTSA evaluators to understand the translational potential of supported research. Therefore, we included the STPP and iCite’s Translation module to assess whether our CTSA-supported translational research was diverse in scope, addressed a wide spectrum of translational categories, and had high translational potential.

Fourthly, we understand that commercial applications and database subscriptions are too diverse across CTSA institutions to standardize on one or two methods of assessing research productivity. Thus, the approach we have taken allows an institution to tailor its approach to assessing the growth and impact of its CTSA efforts.

Finally, the mixed approach and its findings helped NC TraCS build program foci. Our evaluation team meets with the CTSA leadership regularly to discuss findings, inform the strategic direction of initiatives (e.g., pilot award funding, team science opportunities), and increase our focus on community engagement, on supporting projects that enhance health equity, and on areas of high priority to the CTSA within our current portfolio of publications.

Conclusion

We expanded our pilot study from 2017 and adopted a mixed-metrics approach (bibliometrics, SNA, and altmetrics) to evaluate the research impact of a CTSA program. We described the changes in research productivity, citation impact, and research collaboration. We assessed CTSA-supported research topics by their prominence, the extent to which they address health disparities, and their potential to translate into improvements in human health. We also observed positive correlations between citation measures and altmetrics of CTSA-supported publications. We suggest that researchers and institutions utilize social media to disseminate their research output widely to the public. Lastly, we encourage other CTSA programs to take a similar mixed-metrics approach to monitor and assess their programs over time and to share their processes and experiences with the CTSA community so that we can advance translational science evaluation together.

Supplementary material

To view supplementary material for this article, please visit https://doi.org/10.1017/cts.2022.530.

Acknowledgments

The project described was supported by the National Center for Advancing Translational Sciences (NCATS), National Institutes of Health (NIH), through Grant Award Number UL1TR002489. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH.

Disclosures

The authors have no conflicts of interest to declare.

References

1. National Center for Advancing Translational Sciences. Translational Science Spectrum [Cited June 7, 2021]. (https://ncats.nih.gov/translation/spectrum)
2. National Center for Advancing Translational Sciences. CTSA Program Hubs [Cited June 9, 2021]. (https://ncats.nih.gov/ctsa/about/hubs)
3. Committee to Review the Clinical and Translational Science Awards Program at the National Center for Advancing Translational Sciences, Board on Health Sciences Policy, Institute of Medicine. The CTSA Program at NIH: Opportunities for Advancing Clinical and Translational Research (Leshner AI, Terry SF, Schultz AM, Liverman CT, eds.). Washington (DC): National Academies Press (US); 2013. DOI: 10.17226/18323.
4. Trochim WM, Rubio DM, Thomas VG; Evaluation Key Function Committee of the CTSA Consortium. Evaluation guidelines for the Clinical and Translational Science Awards (CTSAs). Clinical and Translational Science 2013; 6(4): 303–309. DOI: 10.1111/cts.12036.
5. Rubio DM, Blank AE, Dozier A, et al. Developing common metrics for the Clinical and Translational Science Awards (CTSAs): lessons learned. Clinical and Translational Science 2015; 8(5): 451–459. DOI: 10.1111/cts.12296.
6. Institute of Clinical and Translational Sciences at Washington University in St. Louis. Evaluation [Cited June 7, 2021]. (https://icts.wustl.edu/impact/evaluation/)
7. Grazier KL, Trochim WM, Dilts DM, Kirk R. Estimating return on investment in translational research: methods and protocols. Evaluation & the Health Professions 2013; 36(4): 478–491. DOI: 10.1177/0163278713499587.
8. Zhang G, Zeller N, Griffith R, et al. Using the context, input, process, and product evaluation model (CIPP) as a comprehensive framework to guide the planning, implementation, and assessment of service-learning programs. Journal of Higher Education Outreach and Engagement 2011; 15: 28.
9. Rollins L, Llewellyn N, Ngaiza M, Nehl E, Carter DR, Sands JM. Using the payback framework to evaluate the outcomes of pilot projects supported by the Georgia Clinical and Translational Science Alliance. Journal of Clinical and Translational Science 2020; 5(1): e48. DOI: 10.1017/cts.2020.542.
10. Wooten KC, Rose RM, Ostir GV, Calhoun WJ, Ameredes BT, Brasier AR. Assessing and evaluating multidisciplinary translational teams: a mixed methods approach. Evaluation & the Health Professions 2014; 37(1): 33–49. DOI: 10.1177/0163278713504433.
11. Yu F, Van AA, Patel T, et al. Bibliometrics approach to evaluating the research impact of CTSAs: a pilot study. Journal of Clinical and Translational Science 2020; 4(4): 336–344. DOI: 10.1017/cts.2020.29.
12. Llewellyn N, Carter DR, DiazGranados D, Pelfrey C, Rollins L, Nehl EJ. Scope, influence, and interdisciplinary collaboration: the publication portfolio of the NIH Clinical and Translational Science Awards (CTSA) Program from 2006 through 2017. Evaluation & the Health Professions 2020; 43(3): 169–179. DOI: 10.1177/0163278719839435.
13. Llewellyn N, Carter DR, Rollins L, Nehl EJ. Charting the publication and citation impact of the NIH Clinical and Translational Science Awards (CTSA) Program from 2006 through 2016. Academic Medicine 2018; 93(8): 1162–1170. DOI: 10.1097/ACM.0000000000002119.
14. Schneider M, Kane CM, Rainwater J, et al. Feasibility of common bibliometrics in evaluating translational science. Journal of Clinical and Translational Science 2017; 1(1): 45–52. DOI: 10.1017/cts.2016.8.
15. Sayavedra N, Hogle JA, Moberg DP. Using publication data to evaluate a Clinical and Translational Science Award (CTSA) career development program: early outcomes from KL2 scholars. Journal of Clinical and Translational Science 2017; 1(6): 352–360. DOI: 10.1017/cts.2018.1.
16. Bian J, Xie M, Topaloglu U, Hudson T, Eswaran H, Hogan W. Social network analysis of biomedical research collaboration networks in a CTSA institution. Journal of Biomedical Informatics 2014; 52: 130–140. DOI: 10.1016/j.jbi.2014.01.015.
17. Nagarajan R, Peterson CA, Lowe JS, Wyatt SW, Tracy TS, Kern PA. Social network analysis to assess the impact of the CTSA on biomedical research grant collaboration. Clinical and Translational Science 2015; 8(2): 150–154. DOI: 10.1111/cts.12247.
18. Vacca R, McCarty C, Conlon M, Nelson DR. Designing a CTSA-based social network intervention to foster cross-disciplinary team science. Clinical and Translational Science 2015; 8(4): 281–289. DOI: 10.1111/cts.12267.
19. Milat AJ, Bauman AE, Redman S. A narrative review of research impact assessment models and methods. Health Research Policy and Systems 2015; 13(1): 18. DOI: 10.1186/s12961-015-0003-1.
20. Frechtling J, Raue K, Michie J, Miyaoka A, Spiegelman M. The CTSA National Evaluation Final Report, 2012.
21. Qua K, Yu F, Patel T, Dave G, Cornelius K, Pelfrey CM. Scholarly productivity evaluation of KL2 scholars using bibliometrics and federal follow-on funding: cross-institution study. Journal of Medical Internet Research 2021; 23(9): e29239. DOI: 10.2196/29239.
22. Bian J, Xie M, Hudson TJ, et al. CollaborationViz: interactive visual exploration of biomedical research collaboration networks. PLoS One 2014; 9(11): e111928. DOI: 10.1371/journal.pone.0111928.
23. Nagarajan R, Lowery CL, Hogan WR. Temporal evolution of biomedical research grant collaborations across multiple scales - a CTSA baseline study. AMIA Annual Symposium Proceedings 2011; 2011: 987–993.
24. Luke DA, Carothers BJ, Dhand A, et al. Breaking down silos: mapping growth of cross-disciplinary collaboration in a translational science initiative. Clinical and Translational Science 2015; 8(2): 143–149. DOI: 10.1111/cts.12248.
25. Sorensen AA, Seary A, Riopelle K. Alzheimer’s disease research: a COIN study using co-authorship network analytics. Procedia - Social and Behavioral Sciences 2010; 2(4): 6582–6586. DOI: 10.1016/j.sbspro.2010.04.068.
26. Dozier AM, Martina CA, O’Dell NL, et al. Identifying emerging research collaborations and networks: method development. Evaluation & the Health Professions 2014; 37(1): 19–32. DOI: 10.1177/0163278713501693.
27. DORA. San Francisco Declaration on Research Assessment [Cited June 11, 2021]. (https://sfdora.org/read/)
28. Akers KG. Introducing altmetrics to the Journal of the Medical Library Association. Journal of the Medical Library Association 2017; 105(3): 213–215. DOI: 10.5195/jmla.2017.250.
29. Giustini AJ, Axelrod DM, Lucas BP, Schroeder AR. Association between citations, altmetrics, and article views in pediatric research. JAMA Network Open 2020; 3(7): e2010784. DOI: 10.1001/jamanetworkopen.2020.10784.
30. Punia V, Aggarwal V, Honomichl R, Rayi A. Comparison of attention for neurological research on social media vs academia: an altmetric score analysis. JAMA Neurology 2019; 76(9): 1122–1124. DOI: 10.1001/jamaneurol.2019.1791.
31. Maggio LA, Leroux TC, Meyer HS, Artino AR. #MedEd: exploring the relationship between altmetrics and traditional measures of dissemination in health professions education. Perspectives on Medical Education 2018; 7(4): 239–247. DOI: 10.1007/s40037-018-0438-5.
32. Kunze KN, Polce EM, Vadhera A, et al. What is the predictive ability and academic impact of the altmetrics score and social media attention? The American Journal of Sports Medicine 2020; 48(5): 1056–1062. DOI: 10.1177/0363546520903703.
33. Luc JGY, Archer MA, Arora RC, et al. Does tweeting improve citations? One-year results from the TSSMN prospective randomized trial. The Annals of Thoracic Surgery 2021; 111(1): 296–300. DOI: 10.1016/j.athoracsur.2020.04.065.
34. Llewellyn NM, Weber AA, Fitzpatrick AM, Nehl EJ. Big splashes & ripple effects: a narrative review of the short- & long-term impact of publications supported by an NIH CTSA pediatrics program. Translational Pediatrics 2022; 11(3): 411–422. DOI: 10.21037/tp-21-506.
35. Llewellyn N, Weber A, Nehl E. Making Waves: The Impact of the Georgia CTSA Publication Portfolio from 2007-2021 - Big Splashes and Ripple Effects on Translation (2021 Internal Report). Georgia Clinical & Translational Alliance, 2021.
36. Pranckutė R. Web of Science (WoS) and Scopus: the titans of bibliographic information in today’s academic world. Publications 2021; 9(1): 12. DOI: 10.3390/publications9010012.
37. Elsevier Solutions. Topic Prominence in Science - SciVal [Cited May 27, 2021]. (https://www.elsevier.com/solutions/scival/features/topic-prominence-in-science)
38. Elsevier. About PlumX Metrics - Plum Analytics [Cited June 10, 2021]. (https://plumanalytics.com/learn/about-metrics/)
39. Web Scraper. Web Scraper - The #1 web scraping extension [Cited June 1, 2021]. (https://webscraper.io/)
40. Elsevier Scopus Blog. PlumX Metrics API Now Available for Scopus Subscribers [Cited June 12, 2021]. (https://blog.scopus.com/posts/plumx-metrics-api-now-available-for-scopus-subscribers)
41. NIH Office of Portfolio Analysis. iCite [Cited March 27, 2021]. (https://icite.od.nih.gov/analysis)
42. Hutchins BI, Yuan X, Anderson JM, Santangelo GM. Relative Citation Ratio (RCR): a new metric that uses citation rates to measure influence at the article level. PLoS Biology 2016; 14(9): e1002541. DOI: 10.1371/journal.pbio.1002541.
43. Van Eck NJ, Waltman L. VOSviewer [Internet], 2021 [Cited June 29, 2021]. (https://www.vosviewer.com/)
44. Centre for Science and Technology Studies (CWTS). Publications that applied VOSviewer [Internet] [Cited June 10, 2021]. (https://www.vosviewer.com/publications#applied-publications)
45. Ortega JL. Reliability and accuracy of altmetric providers: a comparison among Altmetric.com, PlumX and Crossref Event Data. Scientometrics 2018; 116(3): 2123–2138. DOI: 10.1007/s11192-018-2838-z.
46. Richardson MA, Park W, Echternacht SR, Bell DE. Altmetric Attention Score: evaluating the social media impact of burn research. Journal of Burn Care & Research 2021; 42(6): 1181–1185. DOI: 10.1093/jbcr/irab026.
47. Asaad M, Howell SM, Rajesh A, Meaike J, Tran NV. Altmetrics in plastic surgery journals: does it correlate with citation count? Aesthetic Surgery Journal 2020; 40(11): NP628–NP635. DOI: 10.1093/asj/sjaa158.
48. Warren VT, Patel B, Boyd CJ. Analyzing the relationship between Altmetric score and literature citations in the implantology literature. Clinical Implant Dentistry and Related Research 2020; 22(1): 54–58. DOI: 10.1111/cid.12876.
49. Llewellyn NM, Nehl EJ. Predicting citation impact from altmetric attention in clinical and translational research: do big splashes lead to ripple effects? Clinical and Translational Science 2022; 15(6): 1387–1392. DOI: 10.1111/cts.13251.
50. IBM Corp. IBM SPSS Statistics. Armonk, NY: IBM Corp, 2020.