
Best Practices for Publishing pXRF Analyses

Published online by Cambridge University Press:  19 April 2024

Kimberly Johnson
Affiliation:
Anthropology Department, Hamilton College, Clinton, NY, USA ([email protected]; [email protected])
Colin P. Quinn*
Affiliation:
Department of Anthropology, University at Buffalo, Buffalo NY, USA
Nathan Goodale
Affiliation:
Anthropology Department, Hamilton College, Clinton, NY, USA ([email protected]; [email protected])
Richard Conrey
Affiliation:
Hamilton Analytical Laboratory, Hamilton College, Clinton, NY, USA ([email protected])
*
([email protected], corresponding author)

Abstract

With its promise of nondestructive processing, rapid low-cost sampling, and portability to any field site or museum in the world, portable X-ray fluorescence (pXRF) spectrometry is rapidly becoming a standard piece of equipment for archaeologists. Even though the use of pXRF is becoming standard, the publication of pXRF analytical methods and the resulting data remains widely variable. Despite validation studies that demonstrate the importance of sample preparation, data collection settings, and data processing, there remains no standard for how to report pXRF results. In this article, we address the need for best practices in publishing pXRF analyses. We outline information that should be published alongside interpretive results in any archaeological application of pXRF. By publishing this basic information, archaeologists will increase the transparency and replicability of their analyses on an inter-analyst/inter-analyzer basis and provide clarity for journal editors and peer reviewers on publications and grant proposals for studies that use pXRF. The use of these best practices will result in better science in the burgeoning use of pXRF in archaeology.


Type
How to Series
Creative Commons
Creative Commons License - CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
Copyright © The Author(s), 2024. Published by Cambridge University Press on behalf of Society for American Archaeology

With the promise of nondestructive processing, rapid low-cost sampling, and portability to any field site or museum in the world, it is not surprising that portable X-ray fluorescence (pXRF) spectrometry is rapidly becoming a standard piece of equipment for archaeologists. The pros and cons of pXRF as a technology for archaeology have been well documented elsewhere (see Feinman and Riebe 2022; Frahm and Doonan 2013; Hunt and Speakman 2015; Millhauser et al. 2011; Shackley 2010, 2012; Tykot 2016; Williams et al. 2020). In this article, we turn our attention from the analytical potential of pXRF to the publication and dissemination of pXRF analyses.

Although the use of pXRF is becoming standard, the publication of pXRF analyses remains widely variable. In an early synthesis of pXRF in archaeology, Shackley (2011:40) claimed that publishing the instrumental settings and analytical strategies used is as important as disseminating the results of the pXRF analyses themselves. Including the processing and instrumental parameters of a study in a manuscript is essential for archaeologists to be able to evaluate the strengths and weaknesses of one another's work and the conclusions drawn from it. Clearly articulated experimental parameters make results more replicable and comparable between studies. As Speakman and Shackley (2013) argue, the validity of pXRF analyses cannot be established unless archaeologists provide the protocols used in the analysis. However, a decade later, published studies continue to routinely come up short in this regard.

Despite validation studies that have demonstrated the importance of sample preparation, data collection settings, and data processing to the quality of the findings, there remains no standard for how to report pXRF results. In other rapidly expanding areas of archaeological science, such as Bayesian chronological modeling, scholars have called for clear publication standards (see Hamilton and Krus 2018). As we show, there is a similar need to establish protocols when publishing pXRF data.

In this article, we explore the risks of continuing to use a rapidly adopted methodological technique without a standard way of publishing results. We propose a list of information that should be published alongside interpretive results in any archaeological application of pXRF. Implementing these practices will result in better science in the burgeoning use of pXRF in archaeology. By publishing this basic information, archaeologists will increase transparency and replicability of their analyses and provide clarity for journal editors and peer reviewers on publications and grant proposals for studies that use pXRF.

THE FLUORESCENCE OF pXRF ANALYSES

We are experiencing a “fluorescence” of pXRF publications in archaeology. After some early applications in the 1990s and 2000s, there has been a rapid and continuous expansion of the use of pXRF in publications since 2010 (Figure 1). Not only is the rate of publication of pXRF studies increasing, but pXRF is also being used to examine the elemental composition of an increasingly diverse array of material culture. Although pXRF is most frequently used to analyze lithics—particularly fine-grained volcanics like obsidian and basalt—it has also been applied to the analysis of sediments, ceramics, rock art, and pigments.

FIGURE 1. Number of publications with archaeological applications of pXRF analysis per year from 1999 through 2021.

For this study, we examined 230 articles that presented the results of pXRF analyses in archaeology (Supplemental Text 1). These articles were collected through an extensive literature review that included pXRF-related articles published through 2021 from seven major journals (Journal of Archaeological Science, Journal of Archaeological Science: Reports, Journal of Field Archaeology, Archaeometry, American Antiquity, Antiquity, and Journal of Anthropological Archaeology), as well as other papers from similarly relevant sources. For each paper, we recorded information on the pXRF analyses, including whether and how the study presented information on 18 key variables (Supplemental Table 1).

NOT JUST POINT-AND-SHOOT

pXRF instruments are deceptively easy to use: they have become popular because of their speed, efficiency, and cost effectiveness. However, “pXRF is not a magic gun that can provide answers to any and all research questions related to elemental composition” (Koenig et al. 2014:168). A pXRF instrument has myriad settings and conditions that can influence the results of an experiment, from the make of the instrument to the power of the beam to the count time and calibration. For example, power (kV and μA) settings affect the quality of data in different sections of the periodic table. Generally, the most accurate data for elements with low atomic numbers (e.g., Mg, Si, P) are obtained with low kV and high μA, and vice versa for elements with high atomic numbers (e.g., Zr, Sr); intermediate kV and μA settings provide optimal data for elements with intermediate atomic numbers (e.g., Ti, Ca). Power settings are shown on most instruments on the handheld display before or during the analysis, or can be found in the manufacturer's specifications for each analytical program. In addition, although there are similarities in the technology used by different manufacturers, there are also distinct differences in how instruments operate, the filters and other components used, and how data are reported.
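This rule of thumb can be sketched as a simple lookup. The cutoffs below are illustrative round numbers of our own choosing, not settings from any manufacturer's specification; real analytical programs define their own kV/μA combinations per instrument:

```python
def suggest_power_regime(atomic_number: int) -> str:
    """Map an analyte's atomic number to the broad power regime
    described above. The thresholds (15, 30) are hypothetical,
    chosen only to illustrate the low-Z / mid-Z / high-Z split."""
    if atomic_number <= 15:    # low-Z: Mg (12), Si (14), P (15)
        return "low kV, high uA"
    if atomic_number <= 30:    # mid-Z: Ca (20), Ti (22)
        return "intermediate kV and uA"
    return "high kV, low uA"   # high-Z: Sr (38), Zr (40)

for element, z in [("Si", 14), ("Ti", 22), ("Zr", 40)]:
    print(element, "->", suggest_power_regime(z))
```

Reporting which regime was actually used (and why) is exactly the kind of detail the checklist below asks for.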

Goodale and colleagues (2012) showed in a study of inter-instrument variability that different makes and models of pXRF instrument produce statistically different results. Speakman and Shackley (2013) demonstrated that there is significant variability in the data produced by the three pXRF instruments (Niton, Innov-X, and Bruker) used in most archaeological research. The power source for the pXRF, such as the type and strength of a battery, also affects the results (Goodale et al. 2012). Newlander and coworkers (2015) demonstrated that counting error is reduced with an increase in count time, necessitating the reporting of count time. Count rates matter because the lower they are, the longer the signal must be counted to achieve useful precision in the measurement. Measurement precision scales—that is, improves—with the square root of the total number of counts, so there is an order-of-magnitude difference in precision between a measurement at 100 counts per second and one at 10,000 counts per second if both are counted for the same length of time. Samples with a smooth face to scan generate more reliable results, whereas nonhomogeneous samples create variable spectral line intensities that make concentration estimates less accurate; transparency in preparation methods is therefore crucial to the accurate replication of experimentation (Jones et al. 1997; LeMoine and Halperin 2021). When properly calibrated, pXRF instruments can be both accurate and precise, but different calibrations produce different results, even from the same manufacturer (Conrey et al. 2014).
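The square-root scaling is worth making concrete. Assuming Poisson counting statistics (so the fractional uncertainty of a measurement is 1/√N for N total counts), a short calculation shows the order-of-magnitude gap between a 100 cps and a 10,000 cps measurement at equal count time:

```python
import math

def relative_precision(count_rate_cps: float, live_time_s: float) -> float:
    """Fractional 1-sigma uncertainty of a Poisson-limited
    measurement: 1 / sqrt(total counts)."""
    total_counts = count_rate_cps * live_time_s
    return 1.0 / math.sqrt(total_counts)

# Two hypothetical measurements, both counted for the same 60 s:
slow = relative_precision(100, 60)     # ~1.3% relative uncertainty
fast = relative_precision(10_000, 60)  # ~0.13% relative uncertainty

# A 100x difference in count rate gives a sqrt(100) = 10x
# (order-of-magnitude) difference in precision:
print(f"precision ratio: {slow / fast:.1f}")
```

The same arithmetic shows why reporting count time matters: doubling the live time buys only a √2 improvement, so readers cannot judge precision without it.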

We identified 18 pieces of information that can affect the replicability and reproducibility of pXRF analyses (Table 1). These factors can be divided into information about (1) project design, (2) sample preparation, and (3) analysis conditions, including instrument settings and object attributes.

TABLE 1. Factors That Can Affect the Replicability and Reproducibility of pXRF Analyses.

INFORMATION FREQUENTLY MISSING FROM PUBLICATIONS

Despite the numerous factors that influence pXRF results, our review of the published literature demonstrates that there is a high degree of variability in whether and how archaeologists include these parameters in publications (Figure 2). On the positive side, some information is nearly universally included. The pXRF instrument model was reported 94.3% of the time, and the composition of samples was reported in 100% of papers: this was the only factor with a 100% inclusion rate.

FIGURE 2. Percentage of publications in our sample that explicitly address the 18 key factors for the replicability and reproducibility of pXRF analyses (blue) and the percentage of those that do not present this information (red). Data derived from 230 publications (see Supplemental Text 1 and Supplemental Table 1).

However, some variables, especially those surrounding experimental conditions, are less reliably reported. For example, 93% of papers do not include information on the source of power (battery life or wall plug) for the pXRF during the analysis. Count time was omitted in 17.8% of papers. The preparation method for samples was only described in 42.2% of publications, despite the importance of this information. The amperage of the pXRF instrument was only reported in 47% of papers. The location where samples were scanned was reported 27.4% of the time. Calibration information was only included in 66.5% of papers.

Although the number of pXRF publications has increased since 2015, the rates of publishing key information have worsened over time. Figure 3 shows the change in the percentage of publications that present the necessary variables between 1999–2015 (75 papers) and 2016–2021 (155 papers). In recent years, half of the pieces of information that can affect the replicability and reproducibility of pXRF analyses were less likely to be included than they were before 2016. Only five variables were more common in more recent publications, and four were nearly unchanged. It is very concerning that the rate of presentation of research parameters in pXRF publications has worsened over time.

FIGURE 3. Change in the percentage of publications that explicitly address the 18 key factors for the replicability and reproducibility of pXRF analyses from 1999–2015 (75 papers) to 2016–2021 (155 papers). Five factors improved (blue bar), nine factors worsened (red bar), three factors minimally changed (<1% change; gray bar), and one factor had the same percentage during both time periods (gray dot). Data derived from 230 publications (see Supplemental Text 1 and Supplemental Table 1).

BEST PRACTICES FOR PUBLISHING pXRF ANALYSES

We argue that the omission of important data in publications is due to a lack of standardized best practices for publishing pXRF data. Authors, editors, and reviewers alike may be either unaware of the importance of presenting these parameters or concerned that such technical details will detract from the clarity of an argument. Because word count is at a premium in papers, we understand the reluctance to include every analytical detail. However, a large amount of information can be expressed in just over 100 words in the text—and more expansively in supplemental information and as metadata associated with the data in a digital repository. To improve the replicability and reproducibility of pXRF studies, we developed a guide to best practices for succinctly presenting the relevant research design, instrument settings, and object characteristics for any pXRF analysis. We formulated this into a checklist (Table 2; Supplemental Text 2) that can be adapted to any pXRF experiment and appended to any future publication. Next, we provide two examples of paragraphs (see also Supplemental Text 2) that include every necessary detail regarding pXRF experimentation as briefly as possible.

This is a hypothetical lab-based experiment.

For this experiment, a Bruker Tracer III-V was used in a laboratory benchtop setup, plugged into a 120 V wall outlet, employing a beam diameter of 3 mm. Samples were scanned for 60 s each at 40 kV and 100 μA. Samples comprised 30 ceramic sherds with quartz temper that were large enough to be scanned (5–8 cm long by 2–7 cm wide). A face on each sherd was ground down using 800-grit silicon carbide and laid directly on the pXRF sample window. Each sample was scanned three times in the same position. Analytical precision was validated with an in-house ceramic standard (contact corresponding author for access). Bruker's “green” filter was used; no vacuum or helium flow was used. A custom calibration for ceramics and ARTAX software were used to process the data.

This example includes all 18 factors and some additional information in fewer than 125 words. This information, even expressed concisely, has numerous benefits for the researchers, readers, and reviewers alike. Although this is the minimum information required for replicability, authors are encouraged to provide as much detail as possible about analytical methods and data curation. For example, they can indicate where samples are curated, whether they are available for reanalysis by other researchers, how to access digital copies of the data, and where the pXRF instrument used in this analysis is located. We should note that most pXRF instrument manufacturers employ different proprietary instrument specifications that are not shared publicly. For example, manufacturers do not reveal the mathematical codes they employ or the materials they use to calibrate the devices. Despite those differences, the 18 variables could be standardized regardless of the instrument being used.
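One way to keep such reporting complete is to treat the checklist as structured metadata and flag omissions before submission. The sketch below is ours, not part of the article's checklist: the field names are hypothetical stand-ins for the 18 factors in Table 1, and the filled-in values are drawn from the hypothetical lab-based example above.

```python
# Hypothetical field names standing in for the 18 reporting factors
# (grouped in Table 1 as project design, sample preparation, and
# analysis conditions); the real wording is in the article's checklist.
REQUIRED_FIELDS = [
    "instrument_make_model", "power_source", "beam_diameter",
    "voltage_kV", "current_uA", "count_time_s", "filter",
    "vacuum_or_helium", "calibration", "software",
    "sample_material", "sample_count", "sample_size",
    "sample_preparation", "scan_location_on_sample",
    "scans_per_sample", "reference_standard", "data_availability",
]

# A partially completed record for the lab-based example:
metadata = {
    "instrument_make_model": "Bruker Tracer III-V",
    "power_source": "120 V wall outlet",
    "beam_diameter": "3 mm",
    "voltage_kV": 40,
    "current_uA": 100,
    "count_time_s": 60,
    "filter": "Bruker 'green' filter",
    "vacuum_or_helium": "none",
}

missing = [field for field in REQUIRED_FIELDS if field not in metadata]
print(f"{len(missing)} of {len(REQUIRED_FIELDS)} factors still unreported:")
print(", ".join(missing))
```

Running such a check before submission makes omissions visible at a glance, and the same record can double as machine-readable metadata when the data are deposited in a digital repository.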

TABLE 2. Checklist of Questions That Represent the Minimum Information Necessary When Publishing the Results of a pXRF Experiment.

a Only needed for field-based data collection.

Field-based applications may require additional information to account for how environmental conditions may affect the consistency and quality of data. An example paragraph based on a previous study (Goodale et al. 2018) is presented next:

This is a hypothetical field-based experiment (drawn from Goodale et al. 2018).

Field-based analysis was conducted using an Innov-X (now Olympus) Omega series spectrometer, operated out-of-the-box in factory preset soil and two-beam mining modes, powered by ND2017A34 Li-ion batteries, with a beam diameter of 1 cm. Cross-stones, made of a range of geological materials from sedimentary limestones to high-grade metamorphic schists (n = 49), and comparative source stone outcrops (n = 23) were analyzed. Each sample was analyzed twice in the same position on the sample, once in soil mode and once in two-beam mining mode, to provide measurements across the suite of elements detectable by the Omega pXRF instrument. In soil mode, samples were scanned for 60 s each without vacuum up to 40 kV, 100 μA. In two-beam mining mode, samples were scanned for a total of 120 s (60 s per beam) under vacuum up to 40 kV, 100 μA. Each cross-stone sample was surveyed for a clean and dry surface for analysis; no other sample preparation occurred. Analytical precision was validated with an in-house fine-grain rock standard (contact corresponding author for access). The Omega instrument uses proprietary Innov-X software to process acquired data with a factory-built calibration based on the Fundamental Parameters method. Field analysis occurred during June 2009 and 2010, with atmospheric conditions during analysis ranging from sunny to cloudy, 60%–85% humidity, 50–70°F, and no active falling precipitation. An instrument-specific “soil foot” was used to stabilize the pXRF instrument to ensure it did not move during analysis.

A final step in promoting the replicability and reproducibility of pXRF analyses is to make the data available to other scholars. Digital repositories such as the Digital Archaeological Record (McManamon et al. 2017) and Open Context (Kansa 2010) provide stable and persistent identifiers that can complement sharing data through supplemental information attached to articles. By following an open science model (Marwick et al. 2017), archaeologists adopting pXRF technologies can work together to improve methods and interpretations.

BENEFITS OF BEST PRACTICES

Transparency and technical details are requirements for replicability and reproducibility in archaeological science. If results vary between studies, is that due to experimental conditions and sample preparation, or to the object of our interest: underlying differences in human behavior? By controlling for experimental conditions, we make it more likely that we can identify patterned human behavior, which is ultimately the subject of concern for archaeology.

An established program of best practices in publishing pXRF results will also ensure accountability on the part of the researchers. The ease with which pXRF results can be generated provides a temptation to gather data without considering how instrument details and setup, sample prep, and experimental conditions might affect those results. By having to articulate the parameters of data collection, archaeologists will be better informed about the choices they make when generating pXRF data.

These best practices may also encourage archaeologists without a background in geosciences or analytical chemistry to incorporate a specialist as part of their collaborative research team. The expertise offered by someone well versed in XRF and pXRF technology can enhance the quality of data collection and interpretation.

The clear articulation of the sample prep and data collection decisions described here will also provide clarity for journal editors and peer reviewers, both for publications and for grants that use pXRF. A section devoted to experimentally necessary information could provide the material that editors and reviewers need to make informed decisions about research.

CONCLUSION

Under the right conditions, pXRF can be an accurate and cost-effective method to analyze the chemical makeup of archaeological materials. However, the conditions under which a pXRF instrument operates must be reported, per the best practices elaborated in this article. A pXRF instrument is a powerful tool, but analyses must be conducted carefully and with precise and replicable parameters.

The dataset we compiled represents a cross-section of published papers and provides an overview of the factors essential to pXRF analysis. Given the variation in the reporting of these factors, we hope our dataset, supplementary materials, and this focus article will encourage further discussion regarding best practices. We also hope this analysis and these materials will bring researchers' attention to the comprehensive inclusion of pXRF parameters in their publications.

Acknowledgments

This research was made possible through support of the Hamilton Analytical Laboratory and the Hamilton College Dean of Faculty Office. Previous research through the Hamilton Analytical Laboratory, especially by David Bailey, George T. Jones, and Khori Newlander, was instrumental in developing this project. We would especially like to thank David Bailey, Catherine Beck, Lacey Carpenter, and Hannah Lau for conversations and comments on earlier versions of this research. We are also grateful to the three anonymous reviewers for their insightful comments and suggestions on an earlier version of this article. All errors and omissions remain the responsibility of the authors. No permits were required for this research.

Funding Statement

This research received no specific grant funding from any funding agency, commercial or not-for-profit sectors.

Data Availability Statement

Data employed in this study are included in the Supplemental Material, which is also available through the Digital Archaeological Record: Supplemental Material—AAP—Best Practices for Publishing pXRF Analyses (Johnson et al. 2024); (tDAR id: 492611).

Competing Interests

The authors declare none.

Supplemental Material

To view supplemental material for this article, please visit https://doi.org/10.1017/aap.2024.6.

Supplemental Table 1. Database of publications with archaeological pXRF analyses used in this study.

Supplemental Text 1. References cited in Supplemental Table 1.

Supplemental Text 2. Template of best practices for publishing pXRF analyses.

REFERENCES CITED

Conrey, Richard M., Goodman-Elgar, Melissa, Bettencourt, Nichole, Seyfarth, Alexander, Van Hoose, A., and Wolff, John A. 2014. Calibration of a Portable X-Ray Fluorescence Spectrometer in the Analysis of Archaeological Samples Using Influence Coefficients. Geochemistry: Exploration, Environment, Analysis 14:291–301. https://doi.org/10.1144/geochem2013-198.
Feinman, Gary M., and Riebe, Danielle J. (editors). 2022. Obsidian across the Americas: Compositional Studies Conducted in the Elemental Analysis Facility at the Field Museum of Natural History. Archaeopress, Oxford. https://doi.org/10.32028/9781803273600.
Frahm, Ellery, and Doonan, Roger C. P. 2013. The Technological versus Methodological Revolution of Portable XRF in Archaeology. Journal of Archaeological Science 40(2):1425–1434. https://doi.org/10.1016/j.jas.2012.10.013.
Goodale, Nathan, Bailey, David G., Jones, George T., Prescott, Catherine, Scholz, Elizabeth, Stagliano, Nick, and Lewis, Chelsea. 2012. pXRF: A Study of Inter-instrument Performance. Journal of Archaeological Science 39(4):875–883. https://doi.org/10.1016/j.jas.2011.10.014.
Goodale, Nathan, Bassett, Madeleine, Bailey, David G., Lash, Ryan, and Kuijt, Ian. 2018. Early Medieval Seascapes in Western Ireland and the Geochemistry of Ecclesiastical Cross Stones. Journal of Archaeological Science 19:894–902.
Hamilton, W. Derek, and Krus, Anthony M. 2018. The Myths and Realities of Bayesian Chronological Modeling Revealed. American Antiquity 83(2):187–203. https://doi.org/10.1017/aaq.2017.57.
Hunt, Alice M. W., and Speakman, Robert J. 2015. Portable XRF Analysis of Archaeological Sediments and Ceramics. Journal of Archaeological Science 53:626–638. https://doi.org/10.1016/j.jas.2014.11.031.
Jones, George T., Bailey, David G., and Beck, Charlotte. 1997. Source Provenance of Andesite Artefacts Using Non-Destructive XRF Analysis. Journal of Archaeological Science 24:929–943. https://doi.org/10.1006/jasc.1996.0172.
Kansa, Eric C. 2010. Open Context in Context: Cyberinfrastructure and Distributed Approaches to Publish and Preserve Archaeological Data. SAA Archaeological Record 10(5):12–16.
Koenig, Charles W., Castañeda, Amanda M., Boyd, Carolyn E., Rowe, Marvin W., and Steelman, Karen L. 2014. Portable X-Ray Fluorescence Spectroscopy of Pictographs: A Case Study from the Lower Pecos Canyonlands, Texas. Archaeometry 56(S1):168–186. https://doi.org/10.1111/arcm.12060.
LeMoine, Jean-Baptiste, and Halperin, Christina T. 2021. Comparing INAA and pXRF Analytical Methods for Ceramics: A Case Study with Classic Maya Wares. Journal of Archaeological Science: Reports 36:102819. https://doi.org/10.1016/j.jasrep.2021.102819.
Marwick, Ben, Jade d'Alpoim Guedes, C. Michael Barton, Bates, Lynsey A., Baxter, Michael, Bevan, Andrew, Bollwerk, Elizabeth A., et al. 2017. Open Science in Archaeology. SAA Archaeological Record 17(4):8–14.
McManamon, Francis P., Kintigh, Keith W., Ellison, Leigh Anne, and Brin, Adam. 2017. tDAR: A Cultural Heritage Archive for Twenty-First-Century Public Outreach, Research, and Resource Management. Advances in Archaeological Practice 5(3):238–249. https://doi.org/10.1017/aap.2017.18.
Millhauser, John K., Rodríguez-Alegría, Enrique, and Glascock, Michael D. 2011. Testing the Accuracy of Portable X-Ray Fluorescence to Study Aztec and Colonial Obsidian Supply at Xaltocan, Mexico. Journal of Archaeological Science 38(11):3141–3152. https://doi.org/10.1016/j.jas.2011.07.018.
Newlander, Khori, Goodale, Nathan, Jones, George T., and Bailey, David G. 2015. Empirical Study of the Effect of Count Time on the Precision and Accuracy of pXRF Data. Journal of Archaeological Science: Reports 3:534–548. https://doi.org/10.1016/j.jasrep.2015.07.007.
Shackley, M. Steven. 2010. Is There Reliability and Variability in Portable X-Ray Fluorescence Spectrometry (PXRF)? SAA Archaeological Record 10(5):17–20.
Shackley, M. Steven. 2011. An Introduction to X-Ray Fluorescence (XRF) Analysis in Archaeology. In X-Ray Fluorescence Spectrometry (XRF) in Geoarchaeology, edited by M. Steven Shackley, pp. 7–44. Springer, New York.
Shackley, M. Steven. 2012. Portable X-Ray Fluorescence Spectrometry (pXRF): The Good, the Bad, and the Ugly. Archaeology Southwest Magazine 26(2). https://www.archaeologysouthwest.org/pdf/pXRF_essay_shackley.pdf, accessed March 30, 2024.
Speakman, Robert J., and Shackley, M. Steven. 2013. Silo Science and Portable XRF in Archaeology. Journal of Archaeological Science 40:1435–1443. https://doi.org/10.1016/j.jas.2012.09.033.
Tykot, Robert H. 2016. Using Nondestructive Portable X-Ray Fluorescence Spectrometers on Stone, Ceramics, Metals, and Other Material in Museums: Advantages and Limitations. Applied Spectroscopy 70(1):42–56. https://doi.org/10.1177/0003702815616745.
Williams, Rhys, Taylor, Gillian, and Orr, Caroline. 2020. pXRF Method Development for Elemental Analysis of Archaeological Soil. Archaeometry 62(6):1145–1163. https://doi.org/10.1111/arcm.12583.