
Strategic policy options to improve quality and productivity of biomedical research

Published online by Cambridge University Press:  12 November 2024

E. Andrew Balas*
Affiliation:
Biomedical Research Innovation Laboratory, Augusta University, Augusta, GA, USA
Gianluca De Leo
Affiliation:
Biomedical Research Innovation Laboratory, Augusta University, Augusta, GA, USA Department of Health Management, Economics and Policy, Augusta University, Augusta, GA, USA
Kelly B. Shaw
Affiliation:
Department of Political Science, Iowa State University, Ames, IA, USA
*
Corresponding author: E. Andrew Balas; Email: [email protected]

Abstract

Emerging societal expectations from biomedical research and intensifying international scientific competition are becoming existential matters. Based on a review of pertinent evidence, this article analyzes challenges and formulates public policy recommendations for improving productivity and impact of life sciences. Critical risks include widespread quality defects of research, particularly non-reproducible results, and narrow access to scientifically sound information giving advantage to health misinformation. In funding life sciences, the simultaneous shift to nondemocratic societies is an added challenge. Simply spending more on research will not be enough in the global competition. Considering the pacesetter role of the federal government, five national policy recommendations are put forward: (i) funding projects with comprehensive expectations of reproducibility; (ii) public–private partnerships for contemporaneous quality support in laboratories; (iii) making research institutions accountable for quality control; (iv) supporting new quality filtering standards for scientific journals and repositories; and (v) establishing a new network of centers for scientific health communications.

Type
Perspective Essay
Copyright
© The Author(s), 2024. Published by Cambridge University Press on behalf of The Association for Politics and the Life Sciences

Introduction

Research in life sciences has produced many astonishing discoveries, leading to major improvements in public health as well as economic progress. Meanwhile, the outstanding discoveries come from a growing stream of publications that also carries large amounts of marginal and questionable results. Considering the major societal interest in greater scientific progress, it is important to examine the driving forces of change.

The partnership of democracy and scientific progress is facing unprecedented challenges

The history of modern science shows that freedom and democracy provide the fertile ground for productive research and the major scientific discoveries that result. Not only vast amounts of published research but also 98% of Nobel Prize-winning discoveries come from the highly developed economies of democratic societies (Table 1).

Table 1. Country affiliation of 21st century STEM Nobel laureates at the time of the award (Nobel Prize Organization, 2023)

When democracy is absent or destroyed, scientific research is also impaired. Until 1933, Germany was the number one producer of Nobel Prize-winning discoveries, but after the Nazi rise to power, there was a dramatic shift of research productivity to democracies, primarily to the United States. Apparently, wide-ranging, successful scientific progress cannot flourish under the narrow worldview of authoritarians.

Dictatorships can concentrate unparalleled resources for accelerated progress in prioritized, promising fields. The jet engine development in Nazi Germany and the remarkably successful early space program of the Soviet Union are examples. However, the full breadth of scientific progress, the game-changing role of accidental discoveries, and breathtaking progress in unexpected areas are more clearly happening in democracies.

Scientific progress is not just the product of freedom and democracy but also the primary defender of these values. Scientific discoveries and subsequent technological achievements are not only essential for economic progress but also serve as vital resources for democracy. A recent article in Nature pointed out that the concept of the “arsenal of democracy” remains as relevant now as it was 83 years ago, but it now relies on data, analytics, and many other innovations (Janicke & Brown, 2022). Therefore, increased investment should drive production, research, and innovation.

Although the physical, chemical, and information sciences have received the most attention in recent decades, it is broadly anticipated that biological and life sciences will play an increasingly influential role in the future. The recently launched Congressional National Security Commission on Emerging Biotechnology recognizes the momentous changes in this field. Accelerating progress of life sciences should be among the preeminent strategic priorities in democratic societies.

Widespread deficiencies of research quality and non-reproducible results

In recent decades, biomedical research has produced many major discoveries and public health improvements, but there are also alarming reports about the ineffectiveness of the research enterprise.

Table 2 summarizes several of the highest impact scholarly studies presenting evidence on pervasive quality defects and non-reproducibility in biomedical research. The results of these publications have been corroborated by many synergistic studies. To address these deficiencies, four policy recommendations targeting points of intervention at the levels of research funding, institutional action, laboratory practice, and publication-phase quality filtering are advanced below.

Table 2. Major quality deficiencies and their estimated frequencies in biomedical research

Of course, research hypotheses often turn out to be incorrect, but that is a normal reflection of our understanding of nature, definitely not an error. Especially in basic research, one may not always know at the outset whether initial hypotheses will be confirmed or refuted in the end. When a hypothesis is logical based on what we know but turns out to be false in the experiment, that is not a quality deficiency. Nobel Laureate John Gurdon’s frog experiment with the unique gene illustrates the value of unexpected “non-reproducibility” that became the starting point of a valuable discovery.

On the other hand, the complexities of the life sciences research process make it prone to deficiencies. Many research projects show signs of fatal but completely avoidable deficiencies like corrupted reagents or gender-biased samples. A rapidly growing number of studies have identified the various causes of quality defects that threaten the reproducibility, credibility, and rigor of biomedical research. As Table 2 summarizes, quality defects in biomedical research are ubiquitous and originate from a multitude of sources.

Publishing non-reproducible results (i.e., presenting results as if they were reproducible) is a serious quality defect that must be avoided. Such non-reproducible research results represent a significant threat not only to the integrity of science but also to effective implementation for health care improvement, derailing industry progress and causing vast economic losses.

Based on abundant evidence, quality defects invalidate more than half of scientific research publications. However, this major shortcoming is still underestimated by society and often overlooked by the research community. Although systematic quality improvement efforts are lacking, measurement of the prevalence of quality deficiencies is gradually becoming more frequent.

The societal impact of research defects

The severity and frequency of errors in life science research are sources of a multitude of societal and public health harms. Experiments can lead to negative but valuable and publishable results. However, major research defects make any kind of study useless, regardless of positive or negative outcome. The harms go far beyond missed opportunities for beneficial discoveries, new treatments for major diseases, better understanding of nature, or improvements in overall wellness and life expectancy.

Most noticeably, the useless participation and sacrifice of study participants raise significant ethical concerns. A study showed that 29% of registered clinical trials remain unpublished, and these trials had an estimated total enrollment of nearly 300,000 participants (Jones et al., 2013). Adding the published but flawed trials, estimated in the range of 32–53% (Balas et al., 2024), participation in defective trials probably affects twice as many people, who are unnecessarily exposed to varying levels of risk and inconvenience.

Publication of untrustworthy clinical trial results can also mislead subsequent clinical practice guidelines and degrade the effectiveness of health care. For example, it was recommended that tranexamic acid be given preventively to everyone undergoing a caesarean based on the results of 36 clinical trials. Later, a large US-led trial with 11,000 participants reported no statistically significant benefits (Pacheco et al., 2023). Moreover, it turned out that many of the originally analyzed 36 clinical trials were untrustworthy (Van Noorden, 2023).

Defective biomedical research can also drain and mislead subsequent scientific studies. A survey of cancer researchers indicated that 50% of respondents had experienced at least one episode of inability to reproduce published data (Mobley et al., 2013). According to one analysis, 788 retracted English-language papers were subsequently cited over 5,000 times by other researchers, and 70,501 patients were enrolled in 851 secondary studies citing the retracted papers.

The expenses associated with wasted research are also astounding. It is estimated that unreliable preclinical research generates direct costs of nearly $28 billion annually in the United States alone (Freedman et al., 2015). More than a hundred million animals (e.g., frogs, mice, rats, hamsters, and dogs) are used in laboratory experiments every year, and the useless sacrifice of animals is probably even more frequent than fruitless participation in human experiments (Kilkenny et al., 2009). The costs of non-reproducible clinical research studies further add to the losses of unreliable preclinical research projects.

Tectonic shift in the production of life sciences

When the number of research publications is the measure, a definite change can be observed in scientific production on the global scale. The NIH National Library of Medicine applies rigorous criteria in selecting the biomedical research journals indexed in its PubMed Medline database. In recent years, a major shift can be observed in the production of Medline-indexed scientific articles based on the institutional affiliation and country location of the first author (Table 3, based on the PubMed Database, 2023). The illustrative Medline term “stroke” shows that not only is the volume of articles growing but the sources of such articles are also rapidly shifting. Considering the benefits of cultural diversity in scientific investigations, the essential role of independent replications, and the need to address health priorities in many more countries, the global diversification of the scientific enterprise is a desirable trend, but it also has competitive implications.

Table 3. Shifting sources of research production: “Stroke”[MeSH Term] more than 50 articles
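For readers who wish to reproduce publication counts like those underlying Table 3, queries of this kind can be sketched against NCBI's public E-utilities `esearch` endpoint. The sketch below only constructs the request URL; the field tags ([MeSH Terms], [ad] for affiliation, [dp] for date of publication) are standard PubMed search syntax, and note that [ad] matches the affiliation of any author, so it only approximates the first-author country attribution used in the table.

```python
from urllib.parse import urlencode

# NCBI E-utilities esearch endpoint (public, no key needed for light use).
ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count_url(mesh_term: str, country: str, year: int) -> str:
    """Build an esearch URL that asks only for the article count
    (rettype=count) for a MeSH term, filtered by author affiliation
    country and publication year. Illustrative only: [ad] matches any
    author's affiliation, not specifically the first author's."""
    term = f'"{mesh_term}"[MeSH Terms] AND {country}[ad] AND {year}[dp]'
    return ESEARCH + "?" + urlencode(
        {"db": "pubmed", "term": term, "rettype": "count", "retmode": "json"}
    )

# Example: count of Medline-indexed "Stroke" articles with a China
# affiliation published in 2022 (fetching the URL returns a JSON count).
url = pubmed_count_url("Stroke", "China", 2022)
print(url)
```

Fetching the resulting URL returns a small JSON document whose `esearchresult.count` field is the article count; repeating the query per country and year yields a table of shifting production shares.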

Obviously, the number of research publications cannot be equated with innovation resulting from scientific discoveries. Comparing each country's number of articles on COVID-19 with its vaccines registered by the World Health Organization reveals major discrepancies. Several European and North American countries have been sources of comparatively fewer publications but were still highly successful in vaccine development, particularly of mRNA vaccines. Meanwhile, China and Japan have been very active in publishing research on COVID but much less successful in vaccine development.

These and other discrepancies should further highlight the quality attribute of research that is essential for success but not captured by simply counting the number of published scientific articles. Meaningful innovation in key technological areas requires important but less obvious qualities, far beyond any bean counting of research production.

Scientific evidence is out of sight while misinformation puts lives at risk

In the midst of research production challenges, the general public and taxpayers lack meaningful access to the latest and best scientific evidence. With the exception of publications behind paywalls, competent health professionals and researchers can find relevant scientific information. However, obscure websites, impractical search engines, highly technical language, and inconsistencies of scientific reporting make essential information largely inaccessible to the general public. Health misinformation spreads fast and easily, but scientifically sound information is often difficult to find and hard to understand.

Although good science is the key to better health, it is often pushed aside and crowded out by misinformation on social media and other popular sources of information. Health misinformation is defined as a health-related claim that is based on anecdotal evidence, false, or misleading owing to a disregard for existing scientific knowledge (Chou et al., 2018).

Receiving harmful health misinformation has become a frequent experience for most internet and social media users (Office of the Surgeon General, 2021; Wang et al., 2019). The examples are almost endless: anorexia popularized as fashion and ideal beauty, the Zika virus portrayed as a bioweapon, misleading portrayals of the health effects of tobacco to generate a positive image of smoking, the fraudulent linkage of the MMR vaccine to autism, and dubious, unsubstantiated “treatments” for cancer, diabetes, heart disease, and other illnesses (Table 4). Occasionally, celebrities add credibility to worthless or harmful health misinformation.

Table 4. Major sources and risks of health misinformation

Apparently, false and misleading health information spreads more easily through social media than scientific knowledge (Vosoughi et al., 2018). False news stories were 70% more likely to be shared on social media than accurate information. According to an analysis of health misinformation on social media, its frequency was highest on Twitter and on issues of smoking products and drugs, with vaccines and major diseases following (Suarez-Lledo & Alvarez-Galvez, 2021). Misinformation is often more popular than factual messages (Wang et al., 2019). The rise of false information has created an urgent threat and is literally putting lives at risk.

Under the NIH Public Access Policy, there is open access to published results of NIH-funded research at the NIH NLM PMC website, a free digital archive (NIH, 2024). NIH-funded investigators are required to submit to PMC an electronic version of the final, peer-reviewed manuscript upon acceptance for publication. In the prevailing absence of plain language summaries, even openly available scientific literature has major limitations in supporting access by the general public.

Simply spending more on research will not be enough in the global competition

The common public policy response to the need for more research and more discoveries is more funding. The call for more funding comes not only from the research community but also from the general public and political leaders. A 2022 survey commissioned by Research!America found that more than 9 in 10 Americans (92%) agree investing in research is important to finding new ways of preventing, treating, and curing illnesses.

Indeed, the budget of the NIH was doubled by Congress between 1998 and 2003. The funding increases have led to more research and expansion of the research enterprise, with some variability over time. In return, more investment in science increased the number of valuable results and strengthened competitiveness. According to Azoulay et al. (2019), for every $10 million of funding, NIH-supported research has generated a net increase of 2.7 patents. Furthermore, NIH-funded articles have greater journal impact factors than non-NIH-funded articles (5.76 versus 3.71; Lyubarova et al., 2009).

According to the NSF, the share of global R&D performed by the United States declined from 29% to 27%, whereas the share performed by China increased from 15% to 22% between 2010 and 2019 (Burke et al., 2022). The Chinese economy is projected to overtake the US economy in 2030 (Jenkins, 2022). Research and development expenditures as a percentage of GDP were 2.40% in China and 3.45% in the United States in 2020, with the former growing more rapidly (Zhou & Dahal, 2024).

Without major adjustments in research growth strategies, any country relying only on the spending model of growing research will inevitably end up in a second-class role. “The Chinese economy is probably going to be at least twice as big as the US’ economy, maybe three times,” summed up Elon Musk at the Air Warfare Symposium in Orlando, Florida in 2020. “If you’re not innovative, you’re going to lose.” The only way to compete is improvement in quality, effectiveness, productivity, and overall innovativeness of scientific research.

It should also be noted that rigid reliance on the traditional peer review system and on the usual university promotion and tenure measures of scientific productivity limits improvements in research quality. The serious limitations of peer review have multiple components: it is based only on inspection of the final product, the scientific manuscript. Just as the quality of auto manufacturing cannot be accurately judged or improved by inspection in the dealer’s showroom, peer review is similarly limited. Submitted manuscripts contain only what the authors want to communicate, and essential details and underlying data are often missing. Most reviewers are overloaded, and the review process itself is almost entirely voluntary.

There have been several meritorious attempts to go beyond publications and define a fuller range of scientific products in assessing the productivity of research (e.g., Bernard Becker, 2014). However, none of these efforts have been successful in changing the mainstream of research productivity evaluations.

While each of the above-described deficiencies of the biomedical research enterprise deserves many more studies and focused actions, the already accumulating evidence is more than enough to urge the formulation of national policies that can not only advance relevant studies but are also likely to improve the quality and availability of research results. What is becoming obvious is that the overwhelming majority of quality defects are produced in the research process prior to publication. Therefore, the focus should remain on improving the research process itself.

Lines of policy actions to improve quality and productivity of research

In response to the recognition of quality deficiencies in research, numerous national and local efforts have been initiated. Some involved addressing selected aspects of quality and rigor (e.g., gender representation in samples, reagent authentication). Others include monitoring of retractions in the scientific literature. It is also hoped that the launch of the Advanced Research Projects Agency for Health will bring new energy into the production of impactful research and development activities. Although these and other efforts are meritorious, they are by no means sufficient to address the multitude of quality and productivity problems in the research enterprise.

Effectiveness of the entire life sciences research enterprise must be regularly examined and continuously improved. Occasionally, counter-intuitive but very effective legislative changes can make a huge difference, as exemplified by the highly successful Bayh-Dole Act in making intellectual property from university research available for technology innovation (Mowery et al., 2001). The following is a list of national policy actions in major directions of research quality improvement:

Support research with higher expectations of quality and reproducibility

Continued funding increases that exceed inflation remain essential and also promise a good return on investment. The calls are multiplying that funders should be clearer in expecting quality and reproducible research results (Moher et al., 2016).

Studies have shown that spending on research generates considerable economic activity and thereby contributes to economic growth, but the results should be trustworthy and reproducible (Macilwain, 2010). To provide a better foundation for the development of new technologies and greater economic activity, research project solicitations, including requests for applications and requests for proposals, should set not only research priorities but also research quality expectations, including quality control, waste reduction, and reuse of results (Moher et al., 2016). Federal funding is also needed for research effectiveness studies, including but not limited to science-of-science studies on the prevention of non-reproducibility and the development of practically applicable research results.

At the federal level, there should be an Office of Research Quality and Participant Protection to comprehensively monitor quality control, promote the availability of results, and protect the value of investment in scientific research. This could be somewhat analogous to the quality control of health care by the Center for Clinical Standards and Quality in the Medicare and Medicaid programs (CMS, 2024). The Office should present an annual report to Congress on the value, reproducibility, and effectiveness of federally funded research, develop recommendations to improve quality, and oversee the representation of the interests of research subjects. This Office should be separate from the Office of Research Integrity, which oversees and directs Public Health Service research integrity activities on behalf of the Secretary of Health and Human Services. The problem of research misconduct should be kept separate from genuine and comprehensive quality improvement efforts.

Public–private partnerships for contemporaneous quality control support in research laboratories

The epicenter of research innovation is the creative researcher working in the laboratory. Several authors have directed attention to the central importance of laboratories in reducing waste from biomedical research (Ioannidis et al., 2014; Stroth, 2016). Obviously, it is essential to improve research quality and reporting at the time of production rather than afterward. It would be important to engage basic and clinical scientists, including early-stage researchers, in discussions about quality control measures. There is a great need to gain a better understanding of correctable errors, including those that are unrecognizable in the scientific review process of final reports. In partnership with the private sector, the federal government should support the development of systems and services for on-the-go quality support in biomedical research laboratories.

Experience in several industries shows that quality cannot be meaningfully assessed just by inspecting the final product. For example, the Food and Drug Administration, which is responsible for assuring the quality and safety of new treatments, requires not only the presentation of product samples and clinical test results but also access to the manufacturing facilities to make sure that the production process maintains quality and delivers a consistent product (FDA, 2024). Nothing like that exists in the research enterprise. Consequently, the entire research documentation process should be reformed, making it not just more transparent but also smarter in guiding researchers about the steps to follow or avoid at various important decision points of biomedical research.

Make research institutions accountable for production quality and reproducibility

Research universities have a significant responsibility for maintaining excellence in research quality and output. Consequently, they need to foster an environment that supports scholarly work, research faculty creativity, and the opportunity to conduct advanced projects (Vernon et al., 2018). Among others, core research instrumentation services, like electron microscopes, mass spectrometers, and NMR machines, are pivotal components of a university’s research infrastructure, enabling groundbreaking science by providing access not only to cutting-edge technologies but also to quality control and collaborative environments that drive innovation and knowledge creation.

Institutions that receive significant federal support for facilities and administration expenses should take ownership of quality assessment and improvement efforts at the institutional level. Deficiencies, especially those leading to non-reproducible results, are often hard to recognize and manage at the level of individual research laboratories. Although individual research projects may show many variations and exceptions, improving the recruitment of patients and the completion rates of randomized clinical trials need to be institutional priorities (Finkelstein et al., 2015; Robishaw et al., 2020). Research institutions should be encouraged to assess the quality of intramural research, launch improvement initiatives, and use new metrics of societal impact and outcome assessment.

Increasing the practical impact of research urges a shift in the evaluation of scientific productivity toward more impact-oriented assessments. It is believed that fewer than half of faculty inventions with commercial potential are disclosed to the universities employing the inventors (Jensen et al., 2003). The San Francisco Declaration on Research Assessment, released by a group of editors and endorsed by more than 23,000 signatories in 161 countries, advocated moving away from evaluating researchers with journal-based metrics toward recognition for data sets, software, and influence on policy (Pain, 2023). The current system of research assessment, which relies primarily on the number of publications and research grants received, is no longer justifiable.

The procedural measures should be regularly supplemented and, to some extent, replaced with outcome measures of practical impact that include, for example, licensable intellectual property and the use of research results by practitioners (Balas & Elkin, 2013). Such demonstrations of beneficial research outcomes should by no means disadvantage curiosity-driven research, as illustrated by most Nobel Prize-winning discoveries of recent decades that led to large numbers of patents, new clinical services, changed practice guidelines, and many other important outcome improvements.

Expect stated quality filtering standards from scientific journals and repositories

Peer review of scientific reports at the time of publication is not the only step of quality control but remains an important milestone. The already emerging trend of requiring simultaneous data publication and reliance on internationally accepted research standards, or collections of such standards like the EQUATOR Network, should be greatly expanded to increase transparency and improve quality control of the research process (Altman & Simera, 2016). Eventually, reported research should become auditable by independent reviewers.

Several interventions have been proposed and studied to improve the quality of peer review in scientific publishing, among them providing training and guidelines for peer reviewers, utilizing checklists or structured review forms, implementing open peer review practices, and exploring post-publication peer review. The peer review process could be greatly strengthened with more structured quality assessments. Although some interventions have shown promise, more rigorous research is needed to establish evidence-based practices for improving the quality of prepublication review (Bruce et al., 2016).

With the move toward open access and easy publication through the internet, the primary function of scientific publishing has become the protection of research integrity and the quality control of research results. In indexing and acceptance, federal agencies and research funders should prioritize journals with acceptable quality control processes and should set standards for such filtering. Such efforts should cover not only traditional scientific journals but also the rapidly growing scientific data repositories, which are at risk of being inundated with questionable submissions (Husen et al., 2017). Journals and publishers of research results, including scientific data repositories, that disseminate federally funded research should publish regularly updated statements regarding quality expectations, quality control, preservation, and security measures.

Launching a network of centers for scientific health communications

The fight against health misinformation should start with better communication of sound science and evidence. It is not sufficient just to produce scientific evidence, but funders should spend more effort on getting it to the users and the public. Patients, families, and communities should be served with not just more science but also easy-to-understand communication of the most impactful discoveries. Currently, several institutions provide scientifically sound patient information, but these efforts do not receive federal support and cannot be fully comprehensive or up to date on the latest developments (e.g., Cleveland, 2024; Mayo, 2024).

Perhaps through a targeted grant program, federal investment is needed to launch and support scientific evidence centers serving various constituent communities. Obviously, no government agency can serve as the trusted source of plain language scientific information. However, federally funded centers could collect appropriately processed evidence and communicate plain language summaries to major patient groups and key communities. Multiple evidence centers could avoid the impression of government-mandated or endorsed scientific evidence.

These centers for scientific health communications should produce and disseminate plain-language communications (FitzGibbon et al., 2020; Rosenberg et al., 2021): clear, nontechnical, and easily understandable descriptions of medical and scientific evidence. Graphical summaries, medium-complexity text summaries, and videos appear to be effective formats. The centers could serve, for example, K-12 education, major disease groups, local community leaders, employee wellness programs, and many others.

Many people appreciate plain-language lists because they can guide practical actions. To preserve the connection with sound science, it remains important to link each item to at least one, and preferably several, peer-reviewed research publications with substantiating evidence. Some action items may, of course, come from practical experience rather than research, but acknowledging them as such would add to the credibility of scientific communications.

In conclusion, improving the quality and productivity of biomedical research has become an ethical, scientific, health care, public health, budgetary, and national security imperative. In many ways, improving the productivity and effectiveness of the scientific enterprise, particularly the life sciences, is key to future societal progress, economic growth, and public health alike. Scientific research, technology innovation, and effective production are the arsenal of democracy. The productive synergy between democracy and scientific research should not only be maintained but also strengthened in the interest of human progress and democratic societies.

Acknowledgments

The authors thank the team members of the Biomedical Research Innovation Laboratory for their helpful discussions.

Financial support

US National Institutes of Health (R01 GM146338-02).

Competing interest

The authors declare none.

References

Altman, D. G., & Simera, I. (2016). A history of the evolution of guidelines for reporting medical research: The long road to the EQUATOR Network. Journal of the Royal Society of Medicine, 109(2), 67–77.
Azoulay, P., Graff Zivin, J. S., Li, D., & Sampat, B. N. (2019). Public R&D investments and private-sector patenting: Evidence from NIH funding rules. The Review of Economic Studies, 86(1), 117–152.
Balas, E. A., Bussi, S., Asem, N., Amour, C., Mwanziva, C., Vazquez, J., Labib, N. A., Price, M., Mahande, M. J., Baskar, R., Dhantu, S., Townsend, T. G., & Aubert, C. (2024). FAIR reporting of clinical trials for public health practice. Proceedings of the European Academy of Sciences & Arts, 3, 1–9.
Balas, E. A., & Elkin, P. L. (2013). Technology transfer from biomedical research to clinical practice: Measuring innovation performance. Evaluation & the Health Professions, 36(4), 505–517.
Begley, C. G., & Ellis, L. M. (2012). Raise standards for preclinical cancer research. Nature, 483(7391), 531–533.
Bernard Becker Medical Library (2014). Becker Medical Library model for assessment of research impact. https://becker.wustl.edu/impact-assessment (accessed 3 January 2024).
Bruce, R., Chauvin, A., Trinquart, L., Ravaud, P., & Boutron, I. (2016). Impact of interventions to improve the quality of peer review of biomedical journals: A systematic review and meta-analysis. BMC Medicine, 14, 1–16.
Burke, A., Okrent, A., Hale, K., & Gough, N. (2022). The State of US Science & Engineering 2022. National Science Board Science & Engineering Indicators. NSB-2022-1. National Science Foundation.
Chou, W. S., Oh, A., & Klein, W. M. P. (2018). Addressing health-related misinformation on social media. JAMA, 320(23), 2417–2418.
Cleveland Clinic (2024). Health Library. https://my.clevelandclinic.org/health.
CMS (2024). CMS organizational chart. https://www.cms.gov/about-cms/who-we-are/organizational-chart (accessed 2 June 2024).
FDA (2024). Facts about the current good manufacturing practice (CGMP). https://www.fda.gov/drugs/development-approval-process-drugs/pharmaceutical-quality-resources (accessed 1 June 2024).
Finkelstein, J. A., Brickman, A. L., Capron, A., Ford, D. E., Gombosev, A., Greene, S. M., … Sugarman, J. (2015). Oversight on the borderline: Quality improvement and pragmatic research. Clinical Trials, 12(5), 457–466.
FitzGibbon, H., King, K., Piano, C., Wilk, C., & Gaskarth, M. (2020). Where are biomedical research plain-language summaries? Health Science Reports, 3(3), e175.
Freedman, L. P., Cockburn, I. M., & Simcoe, T. S. (2015). The economics of reproducibility in preclinical research. PLoS Biology, 13(6), e1002165.
Guidry, J. P., Carlyle, K., Messner, M., & Jin, Y. (2015). On pins and needles: How vaccines are portrayed on Pinterest. Vaccine, 33(39), 5051–5056.
Husen, S., de Wilde, Z., de Waard, A., & Cousijn, H. (2017). Recommended versus certified repositories: Mind the gap. Data Science Journal, 16(42), 1–10.
Hutchinson, N., Moyer, H., Zarin, D. A., & Kimmelman, J. (2022). The proportion of randomized controlled trials that inform clinical practice. eLife, 11, e79491.
Ioannidis, J. P., Greenland, S., Hlatky, M. A., Khoury, M. J., Macleod, M. R., Moher, D., … Tibshirani, R. (2014). Increasing value and reducing waste in research design, conduct, and analysis. The Lancet, 383(9912), 166–175.
Janicke, H., & Brown, A. L. (2022). Redefining the ‘arsenal of democracy’. Nature Human Behaviour, 6(6), 756.
Jenkins, R. (2022). How China is reshaping the global economy: Development impacts in Africa and Latin America. Oxford University Press.
Jensen, R. A., Thursby, J. G., & Thursby, M. C. (2003). Disclosure and licensing of university inventions: ‘The best we can do with the s**t we get to work with’. International Journal of Industrial Organization, 21(9), 1271–1300.
Jones, C. W., Handler, L., & Crowell, K. E. (2013). Non-publication of large randomized clinical trials: Cross sectional analysis. BMJ, 347, f6104.
Kilkenny, C., Parsons, N., Kadyszewski, E., Festing, M. F. W., Cuthill, I. C., Fry, D., Hutton, J., & Altman, D. G. (2009). Survey of the quality of experimental design, statistical analysis and reporting of research using animals. PLoS One, 4, e7824.
Lyubarova, R., Itagaki, B. K., & Itagaki, M. W. (2009). The impact of National Institutes of Health funding on US cardiovascular disease research. PLoS One, 4(7), e6425.
Macilwain, C. (2010). What science is really worth. Nature, 465(7299), 682–685.
Mansour, N. M., Balas, E. A., Yang, F. M., & Vernon, M. M. (2020). Prevalence and prevention of reproducibility deficiencies in life sciences research: Large-scale meta-analyses. Medical Science Monitor, 26, e922016.
Mayo Clinic (2024). Diseases & conditions. https://www.mayoclinic.org/diseases-conditions.
Mobley, A., Linder, S. K., Braeuer, R., Ellis, L. M., & Zwelling, L. (2013). A survey on data reproducibility in cancer research provides insights into our limited ability to translate findings from the laboratory to the clinic. PLoS One, 8(5), e63221.
Moher, D., Glasziou, P., Chalmers, I., Nasser, M., Bossuyt, P. M., Korevaar, D. A., … Boutron, I. (2016). Increasing value and reducing waste in biomedical research: Who’s listening? The Lancet, 387(10027), 1573–1586.
Mowery, D. C., Nelson, R. R., Sampat, B. N., & Ziedonis, A. A. (2001). The growth of patenting and licensing by US universities: An assessment of the effects of the Bayh–Dole act of 1980. Research Policy, 30(1), 99–119.
Nobel Prize Organization (2023). https://www.nobelprize.org/ (accessed 16 February 2023).
Office of the Surgeon General (2021). Confronting health misinformation: The US Surgeon General’s advisory on building a healthy information environment.
Pacheco, L. D., Clifton, R. G., Saade, G. R., Weiner, S. J., Parry, S., Thorp, J. M., … Macones, G. A. (2023). Tranexamic acid to prevent obstetrical hemorrhage after cesarean delivery. New England Journal of Medicine, 388(15), 1365–1375.
Pain, E. (2023). How academia is exploring new approaches for evaluating researchers. Science. https://www.science.org/content/article/how-academia-is-exploring-new-approaches-for-evaluating-researchers (accessed 25 July 2024).
Prinz, F., Schlange, T., & Asadullah, K. (2011). Believe it or not: How much can we rely on published data on potential drug targets? Nature Reviews Drug Discovery, 10(9), 712.
PubMed Database (2023). https://pubmed.ncbi.nlm.nih.gov/ (accessed 12 January 2023).
Rees, C. A., Pica, N., Monuteaux, M. C., & Bourgeois, F. T. (2019). Noncompletion and nonpublication of trials studying rare diseases: A cross-sectional analysis. PLoS Medicine, 16(11), e1002966.
Robishaw, J. D., DeMets, D. L., Wood, S. K., Boiselle, P. M., & Hennekens, C. H. (2020). Establishing and maintaining research integrity at academic institutions: Challenges and opportunities. The American Journal of Medicine, 133(3), e87–e90.
Rosenberg, A., Baróniková, S., Feighery, L., Gattrell, W., Olsen, R. E., Watson, A., … Winchester, C. (2021). Open Pharma recommendations for plain language summaries of peer-reviewed medical journal publications. Current Medical Research and Opinion, 37(11), 2015–2016.
Sommariva, S., Vamos, C., Mantzarlis, A., Đào, L. U. L., & Martinez Tyson, D. (2018). Spreading the (fake) news: Exploring health messages on social media and the implications for health professionals using a case study. American Journal of Health Education, 49(4), 246–255.
Stroth, N. (2016). The central importance of laboratories for reducing waste in biomedical research. Science and Engineering Ethics, 22, 1707–1716.
Suarez-Lledo, V., & Alvarez-Galvez, J. (2021). Prevalence of health misinformation on social media: Systematic review. Journal of Medical Internet Research, 23(1), e17187.
Van Noorden, R. (2023). Medicine is plagued by untrustworthy clinical trials. How many studies are faked or flawed? Nature, 619(7970), 454–458.
Vernon, M. M., Balas, E. A., & Momani, S. (2018). Are university rankings useful to improve research? A systematic review. PLoS One, 13(3), e0193762.
Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359, 1146–1151.
Walter, N., Brooks, J. J., Saucier, C. J., & Suresh, S. (2021). Evaluating the impact of attempts to correct health misinformation on social media: A meta-analysis. Health Communication, 36(13), 1776–1784.
Wang, Y., McKee, M., Torbica, A., & Stuckler, D. (2019). Systematic literature review on the spread of health-related misinformation on social media. Social Science & Medicine, 240, 112552.
Yordanov, Y., Dechartres, A., Porcher, R., Boutron, I., Altman, D. G., & Ravaud, P. (2015). Avoidable waste of research related to inadequate methods in clinical trials. BMJ, 350, h809. https://doi.org/10.1136/bmj.h809
Zhou, Y., & Dahal, S. (2024). Has R&D contributed to productivity growth in China? The role of basic, applied and experimental R&D. China Economic Review, 102281.
Table 1. Country affiliation of 21st century STEM Nobel laureates at the time of the award (Nobel Prize Organization, 2023)

Table 2. Major quality deficiencies and their estimated frequencies in biomedical research

Table 3. Shifting sources of research production: “Stroke”[MeSH Term] more than 50 articles

Table 4. Major sources and risks of health misinformation