
Proliferation of Faulty Materials Data Analysis in the Literature

Published online by Cambridge University Press: 17 January 2020

Matthew R. Linford
Affiliation:
Department of Chemistry and Biochemistry, Brigham Young University, Provo, UT 84602, USA
Vincent S. Smentkowski*
Affiliation:
General Electric Research, Niskayuna, NY 12309, USA
John T. Grant
Affiliation:
Surface Analysis Consultant, Clearwater, FL 33767, USA
C. Richard Brundle
Affiliation:
C.R. Brundle & Associates, Soquel, CA 95073, USA
Peter M.A. Sherwood
Affiliation:
University of Washington, Box 351700, Seattle, WA 98195, USA
Mark C. Biesinger
Affiliation:
Surface Science Western, University of Western Ontario, London, Ontario N6G 0J3, Canada
Jeff Terry
Affiliation:
Department of Physics, Illinois Institute of Technology, Chicago, IL 60616, USA
Kateryna Artyushkova
Affiliation:
Physical Electronics, Chanhassen, MN 55317, USA
Alberto Herrera-Gómez
Affiliation:
CINVESTAV – Unidad Queretaro, Real de Juriquilla 76230, Mexico
Sven Tougaard
Affiliation:
Department of Physics, University of Southern Denmark, Odense 5230, Denmark
William Skinner
Affiliation:
Future Industries Institute, University of South Australia, Mawson Lakes, SA 5095, Australia
Jean-Jacques Pireaux
Affiliation:
University of Namur, Namur Institute of Structured Matter, B-5000 Namur, Belgium
Christopher F. McConville
Affiliation:
College of Science, RMIT University, Melbourne, VIC 3001, Australia
Christopher D. Easton
Affiliation:
CSIRO Manufacturing, Ian Wark Laboratories, Clayton, VIC 3168, Australia
Thomas R. Gengenbach
Affiliation:
CSIRO Manufacturing, Ian Wark Laboratories, Clayton, VIC 3168, Australia
George H. Major
Affiliation:
Department of Chemistry and Biochemistry, Brigham Young University, Provo, UT 84602, USA
Paul Dietrich
Affiliation:
SPECS Surface Nano Analysis GmbH, 13355 Berlin, Germany
Andreas Thissen
Affiliation:
SPECS Surface Nano Analysis GmbH, 13355 Berlin, Germany
Mark Engelhard
Affiliation:
Pacific Northwest National Laboratory, Richland, WA 99354, USA
Cedric J. Powell
Affiliation:
National Institute of Standards and Technology, Gaithersburg, MD 20899, USA
Karen J. Gaskell
Affiliation:
University of Maryland, College Park, MD 20742, USA
Donald R. Baer
Affiliation:
Pacific Northwest National Laboratory, Richland, WA 99354, USA
*Author for correspondence: Vincent S. Smentkowski, E-mail: [email protected]


Type: Letter to the Editor
Copyright © Microscopy Society of America 2020

As a group of subject-matter experts in X-ray photoelectron spectroscopy (XPS) and other material characterization techniques from different countries and institutions, we write this document to raise awareness of an epidemic of poor and incorrect materials data analysis in the literature. This issue is a growing problem with many causes and very undesirable consequences. It contributes to what has been called a “reproducibility crisis,” which is a recent concern of the U.S. National Academies (Baker, 2016; Harris, 2017; National Academies of Sciences, Engineering, & Medicine, 2019).

Over the past decade, material analysis techniques have matured to the point that dedicated expert operators are often not considered to be necessary to collect and analyze data, especially when the samples are perceived as simple or routine. The tools in this growing arsenal, including XPS, are now used in academia, industry, and government laboratories to provide both compositional information and a mechanistic understanding of a wide variety of materials. This situation, coupled with increased accessibility of the equipment, improved instrument reliability, and the promise of useful data, has resulted in significant growth in the number of researchers using these characterization tools and reporting material analysis data. Although many of the resulting papers are of high quality, especially in journals that focus on materials characterization, others are unsatisfactory. In an ongoing analysis of XPS data in journals that emphasize next-generation materials, we find that about 30% of the analyses are completely incorrect (Linford & Major, 2019). Thus, for some applications, inappropriate data analysis has reached a critical stage, making it difficult for researchers lacking the relevant expertise to find and readily identify reliable examples of what would be considered good-quality data analysis. The errors we are observing in the literature are not limited to journals that may be deemed to be of lower impact; they regularly appear in what are identified as upper-tier, high-impact-factor journals. It is not uncommon to similarly find that 20–30% of the analyses of data from other material characterization techniques are also incorrect (Chirico et al., 2013; Park et al., 2017). The consequences of this issue are significantly greater than merely having a few poorly executed figures in otherwise good papers. Results and conclusions in a study hinge on the data collected and analyzed. If the characterization of a material is incorrect, an entire work may be fundamentally flawed. In some areas, the proliferation of advanced analytical instruments appears to have exceeded the world's supply of expertise necessary to collect, interpret, and review the results obtained from them.

Some sub-disciplines in science only require a single analytical/measurement tool or just a few tools for a complete analysis of their systems. In contrast, materials analysis generally requires multiple advanced-characterization techniques to obtain an appropriate understanding of a new thin film or material (Baer & Gilmore, 2018). These techniques typically require an understanding of the physics and chemistry behind them, can be performed in multiple modes, and often require detailed first-principles and/or established empirical/semi-empirical modeling for their data reduction. Furthermore, each technique is supported by an extensive literature written by experts. Because of the need for information from these methods, the burden placed on materials researchers is heavy. In addition to a requirement to develop novel materials, they must characterize them at a high level with multiple analytical tools. Of course, not every materials problem requires advanced data analysis. Many important quality-control and device-failure problems have been solved by a basic application of one or more pieces of modern characterization equipment. However, in mature industries and fields, advances are more often made through the development of a detailed, comprehensive understanding of materials. In these cases, inadequate data collection and unsatisfactory analysis impede progress.

This epidemic has a plethora of consequences. When too much of the literature is corrupted by poorly collected and poorly interpreted results, a resource that was designed to further the cause of research is compromised. Unfortunately, the literature does not have a generally accepted mechanism for identifying studies (or portions thereof) of questionable value, so incorrect results may influence the thinking, direction, and future research of other scientists and engineers. Anecdotally, we note that, as analysts, we are often asked to reproduce or follow a protocol from the literature that is fundamentally flawed. Incorrect precedent is sometimes cited in the literature, which perpetuates errors. Results from materials characterization influence business decisions, and graduate students and researchers who ought to be able to learn from the literature are misinformed.

In our view, the fact that many incorrect analyses are appearing in the literature is a systemic problem. Researchers are under intense pressure to publish, and without some change to the system, they will most likely continue to do their work as they have in the past. Instrument manufacturers are, at least indirectly, if unintentionally, complicit; they have developed high-quality and easy-to-operate systems that may mislead customers into believing that data collection and analysis from their instruments are straightforward endeavors. This certainly may be the case for some routine samples, but not for all materials. Moderately to very complex materials, such as nanoparticles, nanostructured and two- or three-dimensional materials, catalysts, and anisotropic or graded materials, require a more nuanced approach. Reviewers and editors of manuscripts are often experts in the synthesis and/or development of a particular type of material, and in this sense are appropriately chosen to evaluate certain classes of manuscripts. However, they often do not possess a detailed understanding of all the analytical methods that may have been used to characterize the new materials described in the documents they review. Thus, the structure, traditions, constraints, and pressures of the current scientific endeavor often lead to the publication of faulty or misleading data analysis (Baer & Gilmore, 2018). A partial solution that some of us are applying to the review process is to review only the portions of papers for which we have the needed expertise and to clearly inform the editors of the areas where we are not qualified to provide the needed evaluation. This is only a preliminary suggestion. For evaluation purposes, it would be useful for authors to include more detailed information about characterization in the supplemental information sections allowed by many journals. As this discussion and the analysis of this problem have progressed, our consensus on a solution has become multifaceted, with emphasis on each level of stakeholder. A more detailed analysis of the current problems in XPS data analysis, along with specific actions that can be taken to address the issues, is forthcoming.

Some of us are in the process of writing a series of guides, tutorials, and recommended protocol articles on XPS that are being published in the Journal of Vacuum Science and Technology (Shah et al., 2018; Baer et al., 2019). We believe that these will be an aid to those who wish to acquaint themselves with the technique, so that they can avoid some of the common pitfalls in XPS data analysis and reporting. Many of us have also been involved in developing documentary standards for XPS and other surface-analysis methods that have been published by ASTM International and the International Organization for Standardization (ISO). High-quality surface-analysis data have also been published in Surface Science Spectra. The guides and papers we are developing will include lists of common errors made in XPS data analysis, recommendations to all the stakeholders in this issue, and a more detailed, quantitative analysis of the problem. Similar guidance and reference data, for example, ASTM and ISO standards, have also been developed for other material characterization techniques. We commend the efforts of other groups of experts that similarly teach appropriate data analysis for their methods and call upon the scientific community to pay greater heed to the more accurate work-up and publication of instrumental data.

We close by reiterating that while the focus of this document has been on XPS, similar trends and problems are being noted in all areas of materials characterization.

Dr. C. J. Powell contributed to this communication in a personal capacity. The views expressed are his own and do not necessarily represent the views of the National Institute of Standards and Technology or the United States Government.

References

Baer, DR, Artyushkova, K, Brundle, CR, Castle, JE, Engelhard, MH, Gaskell, KJ, Grant, JT, Haasch, RT, Linford, MR, Powell, CJ, Shard, AG, Sherwood, PMA & Smentkowski, VS (2019). Practical guides for X-ray photoelectron spectroscopy: First steps in planning, conducting, and reporting XPS measurements. J Vac Sci Technol A 37(3), 031401.
Baer, DR & Gilmore, IS (2018). Responding to the growing issue of research reproducibility. J Vac Sci Technol A 36(6), 068502.
Baker, M (2016). 1500 scientists lift the lid on reproducibility. Nature 533(7604), 452–454.
Chirico, RD, Frenkel, M, Magee, JW, Diky, V, Muzny, CD, Kazakov, AF, Kroenlein, K, Abdulagatov, I, Hardin, GR, Acree, WE Jr., Brenneke, JF, Brown, PL, Cummings, PT, de Loos, TW, Friend, DG, Goodwin, ARH, Hansen, LD, Haynes, WM, Koga, N, Mandelis, A, Marsh, KN, Mathias, PM, McCabe, C, O'Connell, JP, Pádua, A, Rives, V, Schick, C, Trusler, JPM, Vyazovkin, S, Weir, RD & Wu, J (2013). Improvement of quality in publication of experimental thermophysical property data: Challenges, assessment tools, global implementation, and online support. J Chem Eng Data 58(10), 2699–2716.
Harris, R (2017). Reproducibility issues. Chem Eng News 95(47), 2.
Linford, MR & Major, GH (2019). Gross errors in XPS peak fitting. In American Vacuum Society Meeting, Columbus, OH.
National Academies of Sciences, Engineering, & Medicine (2019). Reproducibility and Replicability in Science. Washington, DC: The National Academies Press.
Park, J, Howe, JD & Sholl, DS (2017). How reproducible are isotherm measurements in metal-organic frameworks? Chem Mater 29(24), 10487–10495.
Shah, D, Patel, DI, Roychowdhury, T, Rayner, GB, O'Toole, N, Baer, DR & Linford, MR (2018). Tutorial on interpreting x-ray photoelectron spectroscopy survey spectra: Questions and answers on spectra from the atomic layer deposition of Al2O3 on silicon. J Vac Sci Technol B 36(6), 062902.