
Outcomes from research collaborations: What are they and how long do they take?

Published online by Cambridge University Press:  15 July 2013

Lubella A. Lenaburg
Affiliation: University of California–Santa Barbara

Elizabeth S. Sciaky
Affiliation: University of California–Santa Barbara

Tresa M. Pollock
Affiliation: University of California–Santa Barbara

Copyright © Materials Research Society 2013

Materials science has become a highly interdisciplinary field, with research challenges that increasingly require sophisticated instrumentation, experiments, theory, and modeling. It is often the case that no single institution or even a single country possesses the full set of expertise and resources needed for the major research challenges. While this broadening scope motivates research collaborations, there is little quantitative information about how long it takes for collaborations to develop and bear fruit following their initiation. Also, the limited literature available on the success of research collaborations focuses on publications, ignoring other significant outcomes of scientific interactions, such as patents, proposals, research visits, invited talks, and career advancement. Here, we present data on the timeline for scientific outcomes to develop from collaborations. Outcomes often occur many years after a collaboration is initiated, suggesting that long-term tracking over time periods extending beyond the period covered by a typical collaborative research grant is essential to capturing a significant portion of outcomes. Information gained from tracking can be critical for securing funding for programs designed to foster collaboration, particularly where reporting the success from previous programs is required.

The US National Science Foundation (NSF) has recognized the importance of international collaborations and provides funding to bring scientists together through a variety of programs, such as the International Materials Institutes (IMI), Partnership for International Research and Education (PIRE), and the Materials World Network. One NSF project supported through the IMI program is the International Center for Materials Research (ICMR), initiated in 2004 at the University of California–Santa Barbara and now in its ninth year of operation.

To date, the Center has hosted 85 schools and workshops on emerging research topics in materials science, supporting almost 2500 faculty, postdoctoral researchers, and graduate and undergraduate students, representing a wide range of materials science-related disciplines at 764 institutions in 65 countries, including the United States. The Center has also provided funding for US scientists to participate in over 400 extended research visits outside the United States, including significant resources for junior scientists. Because we have information from scientists at all stages of their careers, across many fields, and from a large number of countries around the world, we postulate that our findings likely apply to the broader scientific community and will have implications for programs that aim to increase opportunities for scientific collaboration, particularly on the international scale.

An important element of the ICMR has been its detailed ongoing evaluation plan. Central to this plan is the annual participant tracking initiated in 2006 with a survey sent to all former participants, requesting information on outcomes that have occurred since the time of initial participation. A renewal of funding in 2009 provided an opportunity to conduct long-term tracking of participants. The latest tracking cycle in 2013 included 2475 participants from 2004 to 2013 who received funding from the Center to participate in research visits, schools, or workshops. Considering the busy schedules of this large number of participants, most of whom participated in the program for only a few days, we did not expect that everyone would take time to report each year. However, making the survey available online rather than through email responses, and allowing a sufficient window for participants to submit their responses, have helped improve our response rate. Allowing participants four to six weeks to respond and sending regular reminders are both key to collecting more data, given that this population tends to travel frequently and has many pressing deadlines. Annual response rates are currently over 30%, and over half (1309) of the 2475 participants have responded to at least one tracking request. Of these, 37% (481 of 1309) have reported at least one outcome.

From this annual tracking, we have learned that significant research outcomes do not always immediately follow from program interactions. An outcome may take years to develop or may continue over many years. Additionally, when outcomes do occur, they are not always immediately reported. While many outcomes are reported within the first two or three years following participation, we have found that a large portion may take up to nine years to be reported. A significant portion of this delay is due to the time it takes for outcomes to develop, rather than simply a lack of response to the tracking survey.

To explore how long it takes for outcomes to occur and be reported, we first looked at publications from the 2004–2009 cohorts, who participated during the initial round of funding. These participants have reported 243 publications to date. However, not all were published or reported by 2009: 37% were published in 2010 or later, and 32% were published by 2009 but were not reported until 2010 or later. This means the program would have been unable to report on 69% of the publications from these cohorts if tracking had ceased in 2009. Given that the 2009–2013 cohorts have reported 109 publications so far, we expect to see a similar pattern as we continue tracking them.
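
As a concrete illustration, a breakdown like the one above can be tabulated from tracking records that pair each publication's year of appearance with the year it was first reported. The following Python sketch uses hypothetical record fields (cohort_year, published_year, reported_year) and made-up example records, not the Center's actual data, to show the calculation.

```python
# A minimal sketch of the delay tabulation described above, assuming each
# tracked publication is a record with hypothetical fields: the participant's
# cohort year, the year the paper appeared, and the year it was first reported.
from dataclasses import dataclass

@dataclass
class Publication:
    cohort_year: int     # year of program participation
    published_year: int  # year the manuscript appeared
    reported_year: int   # year it was first reported to the tracking survey

def delay_breakdown(pubs, funding_end=2009):
    """Split publications by whether tracking through funding_end would have caught them."""
    published_late = [p for p in pubs if p.published_year > funding_end]
    reported_late = [p for p in pubs
                     if p.published_year <= funding_end and p.reported_year > funding_end]
    total = len(pubs)
    return {
        "published_after_funding": len(published_late) / total,
        "reported_after_funding": len(reported_late) / total,
        "missed_if_tracking_stopped": (len(published_late) + len(reported_late)) / total,
    }

# Example with made-up records (not the Center's actual data):
sample = [
    Publication(2005, 2008, 2008),
    Publication(2006, 2010, 2011),
    Publication(2007, 2009, 2012),
    Publication(2004, 2007, 2007),
]
print(delay_breakdown(sample))
```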

Overall, we have seen a delay of up to six years between participation and the year a manuscript is published, and up to nine years between the year a manuscript is published and the year it is reported. While nothing can be done to completely eliminate either form of delay, continued tracking over many years will increase the chances that at some point the outcome will be reported. The first five cohorts continue to report outcomes each tracking cycle, and their data demonstrate that approximately 30% of publications take more than two years to be reported once they are published. This leads to roughly half of publications taking three to nine years to be reported following participation.

While publications are an important outcome to track, reports from participants since 2004 demonstrate that non-publication outcomes (936) outnumber publications (363) almost three to one. Most common is an interaction that results in the initiation of a collaboration in which experimental samples are exchanged, expertise is shared, or access to unique facilities is gained. Participating scientists have also subsequently submitted proposals or filed joint patents. Faculty have reported finding postdoctoral researchers to work in their laboratories, and graduate students and postdoctoral researchers report being offered positions by faculty. Participants also report making valuable personal contacts and note the benefits to their research that come from discussing ideas. Combining these with publications brings the total number of outcomes for all cohorts to 1299. The collaboration network formed from these outcomes connects researchers from around the world (see Figure 1).

Figure 1. The International Center for Materials Research network of collaborations, based on 2006–2013 reporting from all cohorts. Each dot represents at least one participant at an institution, color-coded by country.
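
For readers interested in how a network like the one in Figure 1 can be assembled, the sketch below builds an institution-level graph from reported collaborations. The networkx library, the record format, and the example institutions are illustrative assumptions, not the Center's actual data or workflow.

```python
# A minimal sketch, assuming each reported outcome links a participant's home
# institution to a collaborator's institution; the record format and example
# entries are hypothetical.
import networkx as nx

# Hypothetical reported collaborations:
# ((home institution, country), (collaborator institution, country))
reports = [
    (("UC Santa Barbara", "USA"), ("ETH Zurich", "Switzerland")),
    (("UC Santa Barbara", "USA"), ("IISc Bangalore", "India")),
    (("ETH Zurich", "Switzerland"), ("NIMS Tsukuba", "Japan")),
]

G = nx.Graph()
for (inst_a, country_a), (inst_b, country_b) in reports:
    G.add_node(inst_a, country=country_a)   # one node per institution, tagged by country
    G.add_node(inst_b, country=country_b)
    G.add_edge(inst_a, inst_b)              # an edge for each reported collaboration

# Nodes could then be color-coded by their "country" attribute when the graph
# is drawn, giving a map-style view similar in spirit to Figure 1.
print(G.number_of_nodes(), "institutions,", G.number_of_edges(), "collaborations")
```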

Figure 2 shows the proportion of all outcomes reported each year by cohort. Blue represents data collected within the first round of funding, and red represents data collected since the first round, which we would not have if we had not continued tracking. Only instances of new outcomes were counted here, meaning that if someone reported ongoing work with the same collaborator over a period of years without reporting new outcomes, such as proposals or publications, only the initial report of working together was counted.

Figure 2. The data from the first three cohorts demonstrate that less than half of all outcomes are reported within three years of participation.
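
The counting rule behind Figure 2, in which only the first report of working with a given collaborator is counted unless a new type of outcome (such as a proposal or publication) is reported, amounts to a simple deduplication step. The sketch below illustrates it with hypothetical report tuples, not the Center's data.

```python
# A minimal sketch of the counting rule described above: repeated reports of
# ongoing work with the same collaborator are counted once, unless a new type
# of outcome is reported. Record fields are hypothetical.
def count_new_outcomes(reports):
    """Each report is a tuple (participant, collaborator, outcome_type, year)."""
    seen = set()
    new_outcomes = []
    for participant, collaborator, outcome_type, year in sorted(reports, key=lambda r: r[3]):
        key = (participant, collaborator, outcome_type)
        if key not in seen:          # only the first report of this pairing and outcome type counts
            seen.add(key)
            new_outcomes.append((participant, collaborator, outcome_type, year))
    return new_outcomes

reports = [
    ("A", "B", "collaboration", 2006),
    ("A", "B", "collaboration", 2008),   # ongoing work, not counted again
    ("A", "B", "proposal", 2009),        # new outcome type, counted
]
print(count_new_outcomes(reports))       # two counted outcomes
```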

It is worth noting that all five cohorts from the first round of funding have reported outcomes every year since the first round ended, and these later reports account for 64% of the total outcomes reported by these cohorts. As these cohorts continue to report outcomes each tracking cycle, the percentage of total outcomes reported within their first few years of participation continues to drop. We expect the bars for the most recent cohorts to eventually look like those for the first three cohorts, which currently show that less than half of outcomes are reported within three years of participation. We expect some of this delay in reporting is due to a delay in occurrence, as we saw for publications, though we could not explore this fully for non-publication outcomes because their exact time of occurrence is not always as clearly defined as a date of publication.

In conclusion, our data from the earliest cohorts demonstrate that less than half of outcomes are reported within the first three years and that outcomes may take as many as nine years to be reported. While collaborations naturally require time to become productive, and it is not possible to completely eliminate reporting delays, we have employed strategies that have improved our annual response rates from 10% to over 30%.

This improvement in our response rate likely reduces the overall delay in reporting, though it is not possible to say by how much. Even after employing these strategies, we believe it is important to track participants beyond the first three years after participation, and for as many as 10 years, in order to properly measure the impact of funding on research.

In addition to publications, it is important to track any outcome that demonstrates dissemination or collaborative work, including conference presentations and invited talks, patents, proposals, research visits and student exchanges, and career opportunities. When asking participants to report outcomes, it is important to give examples of the types of outcomes they should report and to encourage them to report any other outcomes they recognize as valuable, so long as they can explain how these resulted from their program participation. Each type of outcome may be weighted equally or differently depending on the project goals. Funding agencies and principal investigators should be aware that if they do not provide resources to continue tracking beyond the end of the typical three-to-four-year project grant, the full impact of the program may be significantly underreported. The time invested in this tracking will be beneficial when principal investigators seek new funding, because they can provide data that better reflect the success of their previous projects.
