1. Introduction
Researchers are becoming increasingly aware of the importance of reproducibility and transparency in scientific research and reporting [Reference Munafò, Nosek and Bishop1, Reference Nosek, Alter and Banks2]. A well-documented “replication crisis” in psychology and other disciplines has shown that engrained academic incentives encouraging novel research have led to biased and irreproducible findings [Reference Ioannidis3–Reference Open Science Collaboration6]. Researchers, journals, and funding organisations across psychology and health sciences are contributing to reforming scientific practice to improve the credibility and accessibility of research [Reference Munafò, Nosek and Bishop1, Reference Norris and O’Connor7].
“Open Science,” where some or all parts of the research process are made publicly and freely available, is essential for increasing research transparency, credibility, reproducibility, and accessibility [Reference Kathawalla, Silverstein and Syed8]. Reproducibility-facilitating research behaviours are varied and occur throughout the research life cycle. During study design, pre-registrations and protocols, deposited in repositories such as the Open Science Framework and AsPredicted, specify the hypotheses, methods, and analysis plan to be used in the proposed research. Such specification is designed to reduce researcher degrees of freedom and undisclosed flexibility, ensuring features such as primary and secondary hypotheses and analysis plans remain fixed and preventing “p-hacking” [Reference Head, Holman, Lanfear, Kahn and Jennions9]. Within health research, pre-registration and protocol sharing also facilitate future replication and real-world adoption of medical and behavioural interventions [Reference Huebschmann, Leavitt and Glasgow10]. During data analysis, scripts can be made more reproducible by marking their code with step-by-step comments, improving clarity and replication [Reference van Vliet11]. During dissemination, materials (such as intervention protocols and questionnaires), data, and analysis scripts can be made available by uploading them to repositories such as the Open Science Framework or GitHub [Reference Klein, Hardwicke and Aust12], facilitating the replication of effective research and interventions [Reference Heirene13]. Making data and trial reports available regardless of their findings enables a more accurate picture of the full state of research, minimising the “file drawer” problem whereby positive findings are more likely to be published than negative findings [Reference Rotton, Foos, Van Meek and Levitt14]. Sharing data and analysis code also allows for checking of research findings and conclusions, as well as easier synthesis of related findings via meta-analyses [Reference Ross15]. Transparency-facilitating research behaviours include reporting sources of research funding and conflicts of interest [Reference Fontanarosa, Flanagin and DeAngelis16, Reference Smith17]. These are important in that they help readers to make informed judgements about potential risks of bias [Reference Cristea and Ioannidis18].
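For illustration, a minimal sketch of the kind of step-by-step commented analysis script described above is shown below; the file name, variable names, and analysis steps are hypothetical and not drawn from any specific study.

```python
# Hypothetical, minimal example of a step-by-step commented analysis script;
# file and variable names are invented for illustration.
import pandas as pd

# Step 1: load the trial outcome data shared alongside the report (e.g., on OSF).
data = pd.read_csv("trial_outcomes.csv")

# Step 2: restrict to participants with a recorded six-month smoking outcome.
complete = data.dropna(subset=["quit_at_6_months"])

# Step 3: compute quit rates by trial arm, as specified in the analysis plan.
quit_rates = complete.groupby("arm")["quit_at_6_months"].mean()

# Step 4: save the summary table so the reported result can be regenerated exactly.
quit_rates.to_csv("quit_rates_by_arm.csv")
```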
Metascience studies have assessed markers of reproducibility and transparency in the related domains of psychology and life sciences. A recent study exploring 250 psychology studies of varying study designs published between 2014 and 2017 found transparency and reproducibility behaviours to be infrequent [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19]. Although public availability of studies via open access was common (65%), sharing of research resources was low for materials (14%), raw data (2%), and analysis scripts (1%). Pre-registration (3%) and study protocols (0%) were also infrequent [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19]. Transparency of reporting was inconsistent for funding statements (62%) and conflict of interest disclosure statements (39%) [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19]. Metascience studies have assessed reproducibility and transparency across other disciplines, including 250 studies in social sciences [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20], 149 studies in biomedicine [Reference Wallach, Boyack and Ioannidis21], and 480 studies across two journals in biostatistics [Reference Rowhani-Farid and Barnett22], all with no restrictions on study designs. Other research has focused on the prevalence of specific reproducibility behaviours, such as the prevalence of open access publications, finding about 45% across scientific discipline assessed in 2015 [Reference Piwowar, Priem and Larivière23].
However, the extent of reproducibility and transparency behaviours in public health research, including smoking cessation, is currently unclear. A recent investigation of randomised controlled trials addressing addiction found data sharing to be nonexistent: none of the 394 included trials made their data publicly available, and 31.7% of the included trials addressed tobacco addiction [Reference Vassar, Jellison, Wendelbo and Wayant24]. It must be noted that various persistent barriers to data sharing exist, including technical, motivational, economic, political, legal, and ethical considerations (van Panhuis et al., 2014), which may limit the uptake of this specific Open Science behaviour. Markers of wider reproducibility behaviours are yet to be assessed in addiction research.
Transparent reporting in terms of funding and conflicts of interest is especially crucial for smoking cessation, where tobacco and pharmaceutical companies fund some research directly or indirectly [Reference Garne, Watson, Chapman and Byrne25]. Such vested interests may distort the reporting and interpreting of results, and this may especially be the case in areas of controversy such as e-cigarette research [Reference Heirene13, Reference Smith17, Reference Munafò and West26, Reference West27]. The aim of the current study is to assess markers of (i) reproducibility and (ii) transparency within smoking intervention evaluation reports.
2. Methods
2.1. Study Design
This was a retrospective observational study with a cross-sectional design. Sampling units were individual behaviour change intervention reports. This study applied a methodology used to assess reproducibility and transparency in the wider psychological sciences [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19] and social sciences [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20] to the context of smoking randomised controlled trial intervention reports. This study was pre-registered: https://osf.io/yqj5p. All deviations from this protocol are explicitly acknowledged in the appendix.
2.2. Sample of Reports
The Cochrane Tobacco Group Specialised Register of controlled trials was searched in November 2019, identifying 1630 reports from 2018 to 2019. Inclusion criteria were randomised controlled trials published in 2018 and 2019. Exclusion criteria were trial protocols, abstract-only entries, and economic or process evaluations. Of the 157 reports remaining after applying these criteria, 100 were selected using a random number generator, owing to time and resource constraints. PDFs were obtained from journal websites. These reports were also already included in the ongoing Human Behaviour-Change Project ([Reference Michie, Thomas and Johnston28, Reference Michie, Thomas and Mac Aonghusa29], https://osf.io/efp4x/), which is working to synthesise published evidence in behaviour change, beginning with smoking intervention evaluations. A list of all 100 reports included in this study is available at https://osf.io/4pfxm/.
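As a minimal sketch only (the specific random number generator used is not reported; the identifiers and seed below are hypothetical), the selection of 100 of the 157 eligible reports could be reproduced as follows:

```python
# Hypothetical sketch of randomly selecting 100 of the 157 eligible reports.
import random

eligible_reports = [f"report_{i:03d}" for i in range(1, 158)]  # 157 placeholder identifiers
random.seed(2019)  # illustrative fixed seed so the draw can be repeated
selected_reports = random.sample(eligible_reports, k=100)
print(len(selected_reports))  # prints 100
```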
2.3. Measures
Article characteristics extracted in this study were as follows: (i) the 2018 journal impact factor for each report, obtained using the Thomson Reuters Journal Citation Reports facility, and (ii) the country of the corresponding author (Table 1). Additional article characteristics already extracted as part of the Human Behaviour-Change Project are also reported: (iii) smoking outcome behaviour (smoking abstinence, onset, reduction, quit attempt, or second-hand smoking) and (iv) behaviour change techniques (BCTs) in the most complex intervention group, coded using the Behaviour Change Techniques Taxonomy v1 [Reference Michie, Richardson and Johnston30]. In short, data from the Human Behaviour-Change Project was extracted using EPPI-Reviewer software [Reference Thomas, Brunton and Graziosi31] by two independent reviewers before their coding was reconciled and agreed. The full process of manual data extraction within the Human Behaviour-Change Project is described elsewhere [Reference Bonin, Gleize and Finnerty32]. All extracted data on included papers is available at https://osf.io/zafyg/.
Table 1 note: ∗if a response marked with an asterisk is selected, the coder is asked to provide more detail in a free-text response box. Identified measured variables have been adapted from a previous study assessing transparency and reproducibility in the psychological sciences [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19].
Markers of research reproducibility were assessed by recording the presence of the following in included reports: (i) pre-registration: whether pre-registration was reported as carried out, where the pre-registration was hosted (e.g., Open Science Framework and AsPredicted), whether it could be accessed, and what aspects of the study were pre-registered; (ii) protocol sharing: whether a protocol was reported as carried out and what aspects of the study were included in the protocol; (iii) data sharing: whether data was available, where it was available (e.g., online repository such as Open Science Framework, upon request from authors, as a journal supplementary file), whether the data was downloadable and accessible, whether data files were clearly documented, and whether data files were sufficient to allow replication of reported findings; (iv) material sharing: whether study materials were available, where they were available (e.g., online repository such as Open Science Framework, upon request from authors, as a journal supplementary file), and whether the materials were downloadable and accessible; (v) analysis script sharing: whether analysis scripts were available, where they were available (e.g., online repository such as Open Science Framework, upon request from authors, as a journal supplementary file), and whether the analysis scripts were downloadable and accessible; (vi) replication of a previous study: whether the study claimed to be a replication attempt of a previous study; and (vii) open access publication: whether the study was published as open access.
Markers of research transparency were assessed by recording the presence of the following in included reports: (i) funding sources: whether funding sources were declared and if research was funded by public organisations (such as research councils or charities), pharmaceutical, tobacco, or other companies; (ii) conflicts of interest: whether conflicts of interest were declared and whether conflicts were with public organisations (such as research councils or charities), pharmaceutical, tobacco, or other companies. All measured variables are shown in Table 1.
2.4. Procedure
Data collection took place between February and March 2020. Data for all measures were extracted onto a Google Form (https://osf.io/xvwjz/). All reports were independently coded by two researchers. Any discrepancies were resolved through discussion, with input from a third researcher if required.
2.5. Analysis
Research reproducibility was assessed using the markers of pre-registration; sharing of protocols, data, materials, and analysis scripts; replication; and open access publishing (Table 1). Research transparency was assessed using the markers of funding source and conflicts of interest declarations. Inter-rater reliability of the two researchers’ independent coding was calculated with Krippendorff’s alpha [Reference Hayes and Krippendorff33] using Python 3.6 (https://github.com/HumanBehaviourChangeProject/Automation-InterRater-Reliability).
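The script used for this calculation is available at the repository linked above. As a minimal sketch only (not the project’s own code, and using invented example codings), nominal-level Krippendorff’s alpha for two coders’ binary marker codings could be computed with the open-source krippendorff Python package as follows:

```python
# Minimal sketch: Krippendorff's alpha for two coders' binary (present/absent)
# codings, using the open-source `krippendorff` package (pip install krippendorff).
# The codings below are illustrative only, not the study's data.
import numpy as np
import krippendorff

coder_1 = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
coder_2 = [1, 0, 1, 0, 0, 1, 0, 1, 1, 0]
reliability_data = np.array([coder_1, coder_2], dtype=float)  # rows = coders, columns = coded items

alpha = krippendorff.alpha(reliability_data=reliability_data,
                           level_of_measurement="nominal")
print(f"Krippendorff's alpha: {alpha:.2f}")
```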
3. Results
Inter-rater reliability was excellent across all coding (Krippendorff’s α = 0.87). Full data are provided on OSF: https://osf.io/sw63b/.
3.1. Sample Characteristics
Seventy-one out of 100 smoking behaviour change intervention reports were published in 2018 and 29 in 2019. Four of the 100 reports had no 2018 journal impact factor; the remaining 96 reports had impact factors ranging from 0.888 to 70.67 (mean=4.95). Fifty-four of the 100 reports described studies conducted in the United States of America (https://osf.io/j2zp3/). Data from the Human Behaviour-Change Project identified that, of the 100 reports, 94 had a primary outcome behaviour of smoking abstinence, two each of smoking onset and smoking reduction, and one each of quit attempts and second-hand smoking. Forty-six of the total 93 behaviour change techniques (BCTs) within the Behaviour Change Techniques Taxonomy (BCTTv1) were identified in the included reports, with an average of 4.41 BCTs identified per report. The most commonly identified BCTs were social support (unspecified) (BCT 3.1, n=65/100), pharmacological support (BCT 11.1, n=61/100), problem solving (BCT 1.2, n=42/100), and goal setting (behaviour) (BCT 1.1, n=34/100). A figure of all outcome behaviour and BCT codings can be found at https://osf.io/6w3f4/.
3.2. Markers of Reproducibility in Smoking Behaviour Change Intervention Evaluation Reports
Final reconciled coding of reproducibility and transparency for all smoking behaviour change intervention reports can be found at https://osf.io/jcgx6/.
3.2.1. Article Availability (Open Access)
Seventy-one out of 100 smoking behaviour change intervention reports were available via open access, with 29 only accessible through a paywall (Figure 1(a)).
3.2.2. Pre-registration
Seventy-three out of 100 smoking behaviour change intervention reports stated that they were pre-registered, with 72 of these being accessible. Fifty-four studies were pre-registered at ClinicalTrials.gov, with the remainder pre-registered at the International Standard Randomized Clinical Trial Number registry (ISRCTN; n=7), the Australian and New Zealand Clinical Trials Registry (ANZCTR; n=4), the Chinese Clinical Trial Registry (ChiCTR; n=2), the Netherlands Trial Register (NTR; n=2), the Iranian Clinical Trials Registry (IRCT; n=1), the Clinical Research Information Service in Korea (CRIS; n=1), or the UMIN Clinical Trials Registry in Japan (UMIN-CTR; n=1).
All of the 72 accessible pre-registrations reported methods, with two also reporting hypotheses. Only two accessible pre-registrations included hypotheses, methods, and analysis plans. Twenty-six of the 100 reports did not include any statement of pre-registration, and one report stated that the study was not pre-registered (Figure 1(b)).
3.2.3. Protocol Availability
Seventy-one out of 100 smoking behaviour change intervention reports did not include a statement about protocol availability. Of the 29 reports with accessible protocols, 23 had a protocol that included hypotheses, methods, and analysis plans; three included methods only; two included both hypotheses and methods; and one included methods and analysis plans (Figure 1(c)).
3.2.4. Material Availability
Twenty-two out of 100 reports included a statement saying the intervention materials used were available. Sixteen of these reports provided materials via journal supplementary files, and six reports stated that their materials were only available upon request from the authors (Figure 1(d)).
3.2.5. Data Availability
Sixteen out of 100 reports included a data availability statement. Nine reports stated that data was available upon request from the authors, and one stated that the data was not available. The remaining six articles included their data in the supplementary files hosted by the journals, but one article’s data file could not be opened. Four of the remaining articles had clearly documented data files, but only two of them contained all necessary raw data. As such, in total, only seven reports provided links to data that was actually accessible (Figure 1(e)).
3.2.6. Analysis Script Availability
Three out of 100 reports included an analysis script availability statement. However, only one provided an accessible script as a supplementary file, with the remaining two stating that analysis scripts were available upon request from the authors (Figure 1(f)).
3.2.7. Replication Study
None of the 100 smoking behaviour change intervention reports were described as replication studies (Figure 1(g)).
3.3. Markers of Transparency in Smoking Behaviour Change Intervention Evaluation Reports
Final reconciled coding of reproducibility and transparency markers for all smoking behaviour change intervention reports can be found at https://osf.io/jcgx6/.
3.3.1. Funding
Ninety-four of the 100 smoking behaviour change intervention reports included a statement about funding sources. Most of these reports disclosed public funding only, such as government-funded research grants, charities, or universities (n=80). Eight reports disclosed both public funding and funding from private companies. Five reports disclosed funding from private companies only, including pharmaceutical companies (n=3), tobacco companies (n=1), and other companies (n=1). One report stated that it received no funding (Figure 1(h)).
3.3.2. Conflicts of Interest
Eighty-eight of the 100 articles provided a conflict of interest statement. Most of these declared no conflicts of interest (n=51). Thirty-seven reports declared at least one conflict of interest, including with a pharmaceutical company (n=27), a private company (n=17), a public organisation (n=13), or a tobacco company (n=3) (Figure 1(i)).
4. Discussion
This assessment of 100 smoking behaviour change intervention evaluation reports identified varying levels of research reproducibility markers. Most reports were open access and pre-registered; however, research materials, data, and analysis scripts were not frequently provided, and no replication studies were identified. Markers of transparency, assessed here as funding source and conflicts of interest declarations, were common.
4.1. Assessment of Reproducibility Markers in Smoking Behaviour Change Intervention Evaluation Reports
Pre-registration, as a marker of research reproducibility, was far more common in these smoking RCTs (73%) than in wider psychological research of varying study designs (3%) [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19]. Open access publication was at a similarly moderate level (71%) to that observed in psychology (65%) [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19], but greater than the 45% observed in the social sciences [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20], 25% in biomedicine [Reference Wallach, Boyack and Ioannidis21], and 45% across scientific literature published in 2015 [Reference Piwowar, Priem and Larivière23]. This high rate of open access publishing in smoking interventions may reflect increasing requirements by health funding bodies for funded researchers to publish in open access outlets [Reference Severin, Egger, Eve and Hürlimann34, Reference Tennant, Waldner, Jacques, Masuzzo, Collister and Hartgerink35] and increasing usage of preprint publication outlets such as PsyArXiv for the psychological sciences and medRxiv for medical sciences.
The proportion of open materials was lower than in biomedicine (13% vs. 33%) [Reference Wallach, Boyack and Ioannidis21] but similar to the 11% observed in the social sciences [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20]. Open analysis scripts were found to be as infrequently provided in smoking interventions as in wider psychological research (both 1%) [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19], the social sciences [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20], and biostatistics [Reference Rowhani-Farid and Barnett22].
Open data in smoking interventions was found to be very low (7%), but greater than the 0% estimate in a larger sample of 394 smoking RCTs [Reference Vassar, Jellison, Wendelbo and Wayant24] and the 2% observed in wider psychological research [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19]. Raw data are essential for meta-analyses to make sense of the diverse smoking cessation evidence. Common barriers to including studies in meta-analyses include a lack of available data, often even after requests to authors [Reference Greco, Zangrillo, Biondi-Zoccai and Landoni36, Reference Ioannidis, Patsopoulos and Rothstein37]. Provision of raw data as supplementary files to published intervention reports or via trusted third-party repositories such as the Open Science Framework [Reference Klein, Hardwicke and Aust12] is important to facilitate evidence synthesis, especially in a field as important for global health as smoking cessation.
No replication attempts were identified in this sample of smoking intervention reports, compared to 5% in wider psychology studies [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19] and 1% in the social sciences [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20]. This absence of replication may be due to a lack of available resources from smoking interventions to facilitate replication, as identified in this study, or may reflect a lack of research prioritisation and funding for replication, with novel rather than confirmatory research prioritised at global and institutional levels [Reference Munafò, Nosek and Bishop1, Reference Open Science Collaboration6].
4.2. Assessment of Transparency Markers in Smoking Behaviour Change Intervention Evaluation Reports
Declaration of funding sources and conflicts of interest, as markers of research transparency, was found here to be commonly provided in smoking intervention evaluation reports. Funding sources were declared in more smoking reports (95%) than wider psychology (62%) [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19], social sciences (31%) [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20], and biomedical science reports (69%) [Reference Wallach, Boyack and Ioannidis21]. Similarly, a statement on conflicts of interest was provided more commonly in smoking interventions (88%) than wider psychology (39%) [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19], social sciences (39%) [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20], and biomedical science reports (65%) [Reference Wallach, Boyack and Ioannidis21]. Seventeen percent of studies reported conflicts from private companies and 3% from tobacco companies. The comparatively high level of transparency markers observed here in smoking interventions compared to other fields is likely to reflect improved reporting following previous controversies in the field [Reference Garne, Watson, Chapman and Byrne25, Reference Bero38, Reference Malone and Bero39]. Funding and disclosure statements are now commonly mandated by journals related to smoking cessation [Reference Cristea and Ioannidis18, Reference Munafò and West26, Reference Nutu, Gentili, Naudet and Cristea40].
4.3. Strengths and Limitations
A strength of this study is its use of double coding by two independent researchers of all reproducibility and transparency markers, enabling inter-rater reliability assessment. A limitation is that this study is based on a random sample of 100 evaluation reports of smoking behaviour change interventions, so assessments of reproducibility and transparency may not be generalisable to smoking interventions more broadly. Second, markers of reproducibility and transparency were dependent on what was described within evaluation reports. Direct requests to authors or additional wider searching of third-party registries such as the Open Science Framework may have identified additional information indicating reproducibility. The absence of explicit statements on protocol, material, data, and analysis script availability does not necessarily signal that resources will not be shared by authors, but it does add an extra step for researchers seeking out this information. Third, this approach of assessing Open Science behaviours as reported in published research may omit more nuanced approaches to Open Science taken by journals or authors, which may make the figures assessed here lower than actual practice.
4.4. Future Steps to Increase Reproducibility and Transparency of Smoking Interventions
Urgent initiatives are needed to address the low levels of reproducibility markers observed here in smoking intervention research, especially in the areas of open materials, data, analysis scripts, and replication attempts. As with any complex behaviour change, this transformation requires system change across the bodies involved in smoking cessation research: researchers, research institutions, funding organisations, journals, and beyond [Reference Munafò, Nosek and Bishop1, Reference Norris and O’Connor7]. Interventions are needed to increase the capability, opportunity, and motivation of these bodies to facilitate behaviour change towards reproducible research in smoking interventions [Reference Michie, Thomas and Johnston28, Reference Michie, van Stralen and West41]. For example, capability can be addressed by providing researcher training, equipping researchers with the skills needed to make their research open and reproducible, such as how to use the Open Science Framework, how to use preprint servers, and how to make their analyses reproducible. Opportunity to engage in reproducible research in smoking interventions can be created within institutions by facilitating discussions around open and reproducible working [Reference Orben42] and developing a culture that values progressive and open research behaviours [Reference Norris and O’Connor7].
Motivation to research reproducibly can be addressed by providing researcher incentives [Reference Norris and O’Connor7]. Open Science badges recognising open data, open materials, and pre-registration have been adopted by journals as a simple, low-cost scheme to increase researcher motivation to engage in these reproducibility behaviours [Reference Kidwell, Lazarević and Baranski43]. Open Science badges have been identified as the only evidence-based incentive programme associated with increased data sharing [Reference Rowhani-Farid, Allen and Barnett44]. However, adoption of Open Science badges in smoking cessation journals is currently low, indicating this as one important initiative currently missing from the field. Future research could use this study’s baseline assessment of reproducibility and transparency markers in smoking cessation intervention evaluation reports as a comparator against which to assess changes in reporting and researcher behaviour.
5. Conclusions
Reproducibility markers in smoking behaviour change intervention evaluation reports were varied. Pre-registration of research plans and open access publication were common, whereas the provision of open data, materials, and analysis scripts was rare and replication attempts were nonexistent. Transparency markers were common, with funding sources and conflicts of interest usually declared. Urgent initiatives are needed to improve the sharing of open materials, data, and analysis scripts and to encourage replication attempts. Future research can compare against this baseline assessment of reproducibility and transparency in the field of smoking interventions to assess changes.
Appendix
Updates to Preregistered Protocol
During the course of this study and peer review, we made minor adjustments to the preregistered protocol as follows:
(1) We revised the remit of “smoking cessation” to instead refer to “smoking behaviour change” more broadly. This allowed inclusion of cessation, reduction, and second-hand smoke intervention reports within the Human Behaviour-Change Project knowledge system
(2) Within the article characteristics measured variables, we added “smoking cessation behaviour” to identify whether each report addressed smoking cessation, reduction, or second-hand smoke specifically
(3) Within the article characteristics measured variables, we added “behaviour change techniques” to specify the intervention content identified within each report. Behaviour change techniques were already coded within the parallel Human Behaviour-Change Project, which is working to synthesise published evidence in behaviour change, beginning with smoking intervention evaluations
Data Availability
All data are provided on OSF: https://osf.io/5rwsq/.
Conflicts of Interest
RW has undertaken research and consultancy for companies that develop and manufacture smoking cessation medications (Pfizer, J&J, and GSK). He is an unpaid advisor to the UK’s National Centre for Smoking Cessation and Training and a director of the not-for-profit Community Interest Company, Unlocking Behaviour Change Ltd. No other competing interests to disclose.
Acknowledgments
The authors would like to thank Ailbhe N. Finnerty for calculating inter-rater reliability. EN was employed during this study on The Human Behaviour-Change Project, funded by a Wellcome Trust collaborative award (grant number 201524/Z/16/Z).