
The Breakdown of Effective Field Theory in Particle Physics: Lessons for Understanding Intertheoretic Relations

Published online by Cambridge University Press: 18 September 2024

Adam Koberinski*
Affiliation:
Rotman Institute of Philosophy, Western University, London, ON, Canada

Abstract

Effective field theory (EFT) is a computationally powerful theoretical framework, finding application in many areas of physics. The framework, applied to the Standard Model of particle physics, is even more empirically successful than our theoretical understanding would lead us to expect. I argue that this is a problem for our understanding of how the Standard Model relates to some successor theory. The problem manifests as two theoretical anomalies involving relevant parameters: the cosmological constant and the Higgs mass. The persistent failure to fix these anomalies from within suggests that the way forward is to go beyond the EFT framework.

This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2024. Published by Cambridge University Press on behalf of the Philosophy of Science Association

1. Introduction

Intertheoretic relations play a critical role in our understanding of science. They are central to issues of the (dis)unity of science, reduction, emergence, theory succession, realism, and the historical development of science. But giving a general account of how two theories can or should relate to one another has proven to be a difficult task. Instead, philosophers have tended to focus on particular theories and their relations with other nearby theories, articulating the limiting relations or the conditions under which their domains overlap. While the Nagelian tradition remains a dominant exception to this trend in the context of reduction (Dizadji-Bahmani 2021), working out the details of reduction relations still requires a high degree of specificity about particular theories.

Recently, however, the effective field theory (EFT) framework in quantum field theory (QFT) has provided resources for articulating a wide range of intertheoretic relations in physics, and a conceptual template for how to think about intertheoretic relations in science more generally. The renormalization group provides a mechanism for relating distinct EFTs, as well as a compelling formalism for weak emergence that is compatible with a precise, quantitative account of reduction (Wallace 2019; Knox and Wallace 2023). Further, one can use the renormalization group scaling behavior—a central ingredient in the EFT framework—to make estimates about where our current best theories will break down. This machinery comes ready-made from the foundations of particle and condensed matter physics, and seems like a promising tool for reconceptualizing several issues in the philosophy of science.

For these reasons, there has been growing philosophical interest in the EFT perspective on QFTs. Philosophers have recently argued that the EFT perspective forces us to reconsider our traditional understanding of interpreting physical theories, including consequences for scientific realism, theory semantics, and the nature of intertheoretic relations (Dougherty 2023; Franklin 2020; Koberinski and Fraser 2023; Miller 2021; Rivat and Grinbaum 2020; Rivat 2021; Wallace 2019; Williams 2019). These are often far-reaching claims made about understanding scientific theories as a whole, though the focus is usually on particle physics and the QFTs used there. Others have pushed back on this optimism, arguing that there are clear limits to the applicability of EFTs (Koberinski and Smeenk 2023; Rosaler and Harlander 2019), or that EFTs do not suffice to solve these conceptual problems as advertised (Ruetsche 2020). While inspired by the specific machinery of EFT and the renormalization group from within QFT, the conclusions are often taken to have broader impact for how we should understand physical theories and the relationships between them, or even for how we should think of the semantics of scientific theories more generally. In some sense, the lessons taken from EFT are meant to make our most fundamental physical theories more like the higher-level, less fundamental theories in physics and elsewhere in science.

In order for the broader conclusions to be plausibly motivated by EFTs, one must first check that they hold in the more restricted domains where EFT naturally applies. The main insight from EFT is that we are forced to understand our best QFTs as effective descriptions of physics within some specified range of energy scales. The theoretical machinery used to describe the most fundamental constituents of matter and their non-gravitational interactions breaks down at finite energy scales, suggesting that even by its own lights, the Standard Model of particle physics cannot be understood as even a candidate for a universally applicable theory. Insofar as our most fundamental theories of physics suggest that they cannot be truly fundamental, this undercuts one major motivation to interpret any current theory as though it could be a complete description of some possible world like ours. McKenzie (2019) has argued that this poses a particular problem for naturalistic metaphysics, perhaps forcing a reconception of how we understand theoretical and metaphysical content for theories. But this problem also presents an opportunity in the form of novel metaphysics informed by EFT.

Effective field theories are formulated with an ultraviolet (high-energy) cutoff, at which the predictive power of the theory breaks down. The machinery of the renormalization groupFootnote 1 provides a way to understand how the couplings in an EFT change when probed at different energy scales, and how a given EFT relates to further QFTs defined at higher energy scales. In particular, the renormalization group provides a robust sense of theoretical equivalence between a more fundamental QFT restricted to low energies, and an EFT defined only at those low energy scales. Though EFTs were originally thought of as effective descriptions of a known theory applicable at higher energies, one can consider an EFT as an independent theory in its own right, albeit one with a restricted domain of applicability. This “bottom-up” EFT perspective can be applied to any QFT, and in particular is applied to our current best theory of particle physics, the Standard Model.Footnote 2 The EFT framework is applied successfully to many theories in condensed matter physics and particle physics, where quantitative relationships between an EFT and its successor can be derived. The success of this perspective encourages a general view that we take any QFT to be an EFT, and motivates a unified account of particle physics and condensed matter theories under one framework (Wallace 2021). In particular, we take the Standard Model to be fully described as an EFT (SMEFT), whose renormalizable terms make up the dominant low-energy contributions to the full EFT. We therefore expect that the Standard Model is merely an effective description of physics at some energy scale, to be replaced by some new EFT or some new theory outside the EFT framework, but compatible with its assumptions.

With this perspective, one can also run the renormalization group analysis to place justified estimates on the energy scale at which new physical effects are detectable, and where we should therefore expect the SMEFT to break down. This scaling behavior can be used to successfully predict the breakdown scales for other EFTs in particle physics and condensed matter physics, so the SMEFT should be no different. If the Standard Model is an EFT, and its successor has natural coupling constants, then the EFT and renormalization group techniques that dictate intertheoretic relations elsewhere in particle physics and condensed matter physics tell us that the SMEFT should start to break down at scales currently accessible by the Large Hadron Collider (LHC). Yet so far, no evidence of new physics has been discovered, leading to something of a crisis in particle physics. The failure of predictions based on scaling is an anomaly for the EFT framework, and is theoretical in nature: while the Standard Model continues to have the resources to describe and fit the incoming data, the lack of new physics indicates some flaw in our understanding of the structure of the SMEFT and its relation to more fundamental theories. The two major anomalies that I will discuss here are the hierarchy problem (in the form of the Higgs mass) and the cosmological constant problem. Both of these problems indicate a failure of renormalization group scaling to predict the breakdown scale with respect to relevant couplings in the SMEFT. Candidate solutions to these problems often invoked “beyond Standard Model” physics, but retained the EFT framework. Solutions like supersymmetry involved finding evidence of new particles or new forces near the same energy scale as the Higgs. The surprising lack of discovery of beyond Standard Model physics at the LHC has made this class of solutions increasingly implausible. I suggest that the correct lesson to draw is that future physical theories beyond the Standard Model will also have to move beyond the EFT and naturalness paradigm.

While many of the important general philosophical lessons learned from studying EFTs are not jeopardized by the failures of naturalness for the Standard Model, any conclusions about intertheoretic relations between the SMEFT and its successor theory will require revision. Arguments claiming that the renormalization group provides tools for a prospective realism depend crucially on the EFT framework applying at energy scales well above those probed at the LHC thus far (Fraser 2020; Williams 2019). An influential criticism of renormalization group realism (also called effective realism) due to Ruetsche (2018) has a similar form to the criticism levied here. Ruetsche argues that a prospective realism based on EFTs and the renormalization group must assume that the successor theory fits within the EFT framework. The security of ontological commitment to aspects of the SMEFT is tied to the generality of the EFT framework within which renormalization group flow is defined. Ruetsche’s point is that we don’t have any guarantee that a successor theory will fall within that framework; my argument is that theoretical anomalies within the current theory give us good reason to expect that the successor will not fall within the EFT framework.

Also under threat is the claim that our most fundamental theories of physics must be understood as merely effective theories. If some new physics beyond the Standard Model must necessarily go beyond the EFT framework, there is a possibility that a theory describing that physics will look substantially different from the EFTs used today. However, more tempered arguments, on which any given theory ought to be understood as effective regardless of its form, will remain viable options for those defending an effective view of science and scientific theories. Thus, proponents of effective interpretation strategies ought to focus on the virtues of such views, rather than their necessity. Finally, the failure of EFT methods to predict the breakdown scale of the SMEFT marks a major disanalogy between the use of EFTs in condensed matter physics and particle physics. Despite the fact that the atomic theory of matter underlying condensed matter QFT is not an EFT, EFT methods used in condensed matter physics predict a breakdown of the continuum approximation at energy scales on the order of the inverse atomic spacing.Footnote 3 The anomalies for the SMEFT show that the same is not true in particle physics, marking a disanalogy in how the two disciplines relate EFTs to their successors. This disanalogy is at the level of relating our most fundamental theories in each domain to their successors, and suggests that at least some aspects of the EFT framework should be understood differently in the two disciplines.

I argue that the current failure of the EFT framework—including the renormalization group scaling behavior and some form of reasonable naturalness criterion—to make sense of the relevant parameters in the SMEFT is good evidence that a new theory going beyond the Standard Model is unlikely to fall within the EFT framework. This is a surprising result, given the previous success of EFTs as well as the flexibility of the framework. However, the persistent failure to resolve the two major theoretical anomalies—the hierarchy problem and the cosmological constant problem—should be taken to indicate that new conceptual resources are needed that break the assumptions needed to set up the EFT framework. In section 2 I discuss the hierarchy problem (section 2.1) and the cosmological constant problem (section 2.2), highlighting the issues that they pose for treating the Standard Model as an EFT. Along the way, I also cover some of the basic terminology and formalism for EFTs. In section 3, I outline four possible responses to these anomalies, and argue that the current state of particle physics strongly suggests that the most radical option of moving beyond EFT methods is the necessary step. Section 4 provides the outlines of a two-pronged strategy for effecting conceptual change in particle physics, one theoretical and the other empirical. Both must work in tandem for progress in moving beyond EFTs to be successful.

2. Theoretical anomalies for Standard Model effective field theory

In order to discuss the two major theoretical anomalies facing the SMEFT, I begin this section with a brief discussion of the essential features of the EFT framework, including scaling behavior as determined by the renormalization group equations. For reasons of space this will necessarily be a brief summary with a focus on the conceptual details necessary for what follows. For a more detailed treatment aimed at philosophers, see Koberinski and Fraser (2023), Wallace (2021), and Williams (2021). For more detailed discussions aimed at physicists, see Burgess (2004), Donoghue (2012), Manohar (2020), and Peskin and Schroeder (1995).

The key insight from the development of renormalization group methods that led to the EFT framework for QFTs was that many of the nonrenormalizable interaction terms that one could include in a theory are heavily suppressed as one flows down to lower energies. The requirement that theories contain only renormalizable interactions was initially thought necessary to ensure that a theory was properly predictive; nonrenormalizable terms lead to divergences in predictions of physical quantities when regulators are removed from the theory. On the formal side, the renormalization group analysis provided a means of keeping a regulator in the theory, thereby eliminating the divergences from nonrenormalizable terms.

As Butterfield and Bouatta (2015) have argued, one major accomplishment of the renormalization group was to provide a physical motivation for renormalization, and to further justify the presence of a high-energy regulator in formulating EFTs. To illustrate this point, suppose we start with a fully renormalized QFT without a cutoff regulator. If we write the set of fields as $\{\phi_i\} = \{\phi_1, \phi_2, \ldots, \phi_n\}$, such a theory can be formulated in terms of a generating functional $Z[\phi_i]$ as follows:

(1) $$Z[\phi_i] = \int \mathcal{D}\phi_i \, e^{iS[\phi_i]},$$

with $S[\phi_i]$ the classical action for the theory and $\int \mathcal{D}\phi_i$ the path integral over all field configurations. One way of implementing the renormalization group transformation is to impose a momentum cutoff in the integrals of the generating functional. Separate out the high-momentum field modes $\phi_i^{\mathrm{h}}$ from the low-momentum modes $\phi_i^{\mathrm{l}}$ by the cutoff $\Lambda$—i.e., $\phi_i^{\mathrm{h}}: p^2 + m^2 \geq \Lambda^2$ and $\phi_i^{\mathrm{l}}: p^2 + m^2 < \Lambda^2$. Then, one can perform the path integral over $\phi_i^{\mathrm{h}}$ to create a new effective theory whose degrees of freedom include only the $\phi_i^{\mathrm{l}}$. There will also generically be an additional dependence on $\Lambda$ in the new generating functional:

(2) $$Z[\phi_i^{\mathrm{l}}, \Lambda] = \int \mathcal{D}\phi_i^{\mathrm{l}} \int \mathcal{D}\phi_i^{\mathrm{h}} \, e^{iS[\phi_i^{\mathrm{l}},\, \phi_i^{\mathrm{h}}]}$$
(3) $$= \int \mathcal{D}\phi_i^{\mathrm{l}} \, e^{iS[\phi_i^{\mathrm{l}},\, \Lambda]}.$$

The new effective theory will in general contain new interaction terms between the $\phi_i^{\mathrm{l}}$, some of which may be nonrenormalizable now. But the cutoff scale $\Lambda$ ensures that these terms do not diverge, and the cutoff here is given a clear physical interpretation. The new EFT only describes degrees of freedom with energy up to $\Lambda$. This process can be repeated, shifting infinitesimally from $\Lambda \to \Lambda - \delta\Lambda$, and allows one to define a renormalization group flow for a set of EFTs from high to low energy. The construction also suggests a recipe for constructing EFTs from the bottom up, without reference to a successor theory.

Start constructing an EFT by writing down the appropriate field degrees of freedom and imposing the relevant symmetries on the theory. Then, since the renormalization group transformation will generically change the values of couplings for some terms while introducing new terms, one writes the most general possible Lagrangian containing these fields and respecting these symmetries. This will include an infinite number of terms, each of which has a classical mass dimension. Since a Lagrangian must have a mass dimension of four (in four spacetime dimensions), and each field carries its own mass dimension, coupling strengths $\{g_i\}$ must also have a mass dimension to ensure that the product of fields with that coupling results in a mass dimension of four. We can then define dimensionless coupling constants $\alpha_i$ using the cutoff as the only dimensionful quantity in the theory: $g_i = \alpha_i / \Lambda^n$, with $n$ the integer needed to cancel out the mass dimension of $g_i$, and with $g_i$ the actual coupling term in the Lagrangian. If we specify the fields and symmetries, then we can parameterize a theory space by the set of couplings $\{g_i\}$, ordered from smallest to largest mass dimension. At a given probe energy, an EFT is specified in principle by listing the values of all of the $\{g_i\}$, corresponding to a point in theory space.
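
To make this dimensional bookkeeping concrete, here is a minimal Python sketch that classifies operators by the mass dimension of their couplings. It encodes only classical (engineering) dimensions in four spacetime dimensions; the operators listed are generic illustrations, not the actual SMEFT operator basis.

```python
# Toy bookkeeping for classical mass dimensions in d = 4 spacetime dimensions.
# Illustrative only: the operators below are generic examples, not the SMEFT basis.

FIELD_DIM = {"scalar": 1.0, "fermion": 1.5, "derivative": 1.0}

def operator_dimension(factors):
    """Sum the mass dimensions of the fields/derivatives in an operator."""
    return sum(FIELD_DIM[f] for f in factors)

def coupling_dimension(factors):
    """The Lagrangian density has dimension 4, so [g] = 4 - [operator]."""
    return 4.0 - operator_dimension(factors)

examples = {
    "phi^2 (mass term)":       ["scalar"] * 2,
    "phi^4":                   ["scalar"] * 4,
    "phi^6":                   ["scalar"] * 6,
    "(psi-bar psi)^2 (Fermi)": ["fermion"] * 4,
    "phi^2 (d phi)^2":         ["scalar"] * 4 + ["derivative"] * 2,
}

for name, factors in examples.items():
    d_g = coupling_dimension(factors)
    kind = "relevant" if d_g > 0 else ("marginal" if d_g == 0 else "irrelevant")
    # For irrelevant terms, g = alpha / Lambda^n with n = -[g] = [operator] - 4.
    print(f"{name:26s} [g] = {d_g:+.1f} -> {kind}")
```

Running this recovers the familiar classification: the scalar mass term is relevant, $\phi^4$ is marginal, and the four-fermion (Fermi) operator is irrelevant, its coupling carrying dimension $[mass]^{-2}$.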

It turns out that, under renormalization group flow from higher to lower energies, the couplings behave in one of three ways. Relevant couplings become larger as one flows to lower energies, while irrelevant couplings become smaller. The former have scaling corrections including positive powers of $\Lambda$, while the latter are suppressed by inverse powers of $\Lambda$. Marginal couplings neither grow nor shrink.Footnote 4 Running the renormalization group flow in the opposite direction gives the opposite behavior: terms that are relevant under flow to the infrared (IR; low energies) are irrelevant under flow to the ultraviolet (high energies), and vice versa. The scaling behavior of the couplings is given by the beta function,

(4) $$\beta(g_i) = \frac{\partial g_i}{\partial \ln \mu},$$

where $\mu$ is the reference energy scale, usually interpreted as the energy scale at which one is probing the system. Thus, the IR-relevant terms have a negative beta function and IR-irrelevant terms a positive beta function. For an EFT, the only dimensionful parameter available to implement scaling changes is $\Lambda$. What makes the EFT framework powerful is that, for QFTs in four-dimensional Minkowski spacetime, terms in the Lagrangian with mass dimension greater than four are all IR irrelevant, while mass dimension four terms are marginal, and less than four are relevant. Since the fields themselves have positive mass dimension, the vast majority of possible terms in the EFT are heavily suppressed at energies $\mu \ll \Lambda$, leaving only the marginal and relevant terms. Renormalizable theories are theories with only marginal and IR-relevant terms.
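
A short numerical sketch of this scaling behavior, assuming only tree-level (classical-dimension) running: the dimensionless coupling of an operator of mass dimension $d_O$ obeys $\mu \, \mathrm{d}g/\mathrm{d}\mu = (d_O - 4)\, g$, so relevant couplings grow toward the IR while irrelevant couplings shrink. The cutoff and probe-energy values below are illustrative placeholders.

```python
# Minimal sketch: tree-level running of dimensionless couplings in d = 4,
# beta(g) = mu * dg/dmu = (d_O - 4) * g, ignoring all loop corrections.
# Cutoff and probe-energy values are illustrative placeholders.

def run_coupling(g_at_cutoff, d_O, Lambda, mu):
    """Exact solution of mu dg/dmu = (d_O - 4) g: g(mu) = g(Lambda) (mu/Lambda)^(d_O - 4)."""
    return g_at_cutoff * (mu / Lambda) ** (d_O - 4)

Lambda = 1000.0  # cutoff scale, GeV (illustrative)
mu = 1.0         # probe scale, GeV (illustrative)

for d_O, label in [(2, "relevant (mass term)"),
                   (4, "marginal"),
                   (6, "irrelevant (dim-6)")]:
    print(f"d_O = {d_O}: {label:22s} g(mu) = {run_coupling(1.0, d_O, Lambda, mu):.1e}")
# The relevant coupling grows by (Lambda/mu)^2 = 1e6 toward the IR, the
# marginal coupling is unchanged, and the dim-6 coupling is suppressed by 1e-6.
```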

A fully general EFT will flow down to its renormalizable sector as the renormalization group scaling is taken down to energies well below the cutoff scale. For energies $\mu \ll \Lambda$, irrelevant couplings quickly approach zero, while only the marginal and relevant couplings remain.Footnote 5 This means that many different theories in the theory space—specified by the values of all couplings at a given energy—will flow down to a smaller subsurface in the space, parameterized by only a small set of coupling parameters. Though in principle an EFT requires an infinite set of empirically determined couplings to be fully specified, if one restricts its domain to energies $\mu \ll \Lambda$ then only a small number of couplings are required for any desired degree of precision. If one additionally assumes that the values of the couplings in an unknown successor theory are natural (i.e., the dimensionless parameters $\{\alpha_i\}$ are not $\gg 1$ or $\ll 1$), then the magnitude of nonrenormalizable terms places bounds on the scale $\Lambda$, presumed to be the scale at which new physics occurs. Precision testing of an EFT can reveal the small effects of otherwise hidden nonrenormalizable terms, allowing for an indirect discovery of the breakdown scale for that EFT. Thus, the EFT framework provides a mechanism for placing bounds on the applicability of a given EFT, whether we know its successor or not.
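
The same logic can be run in reverse to bound $\Lambda$ from precision data. A rough sketch, under the stated naturalness assumption (dimensionless coefficient $\alpha \sim 1$): a dimension-six term shifts a dimensionless observable by roughly $\alpha (E/\Lambda)^2$ at probe energy $E$, so an experimental bound $\delta$ on deviations implies $\Lambda \gtrsim E \sqrt{\alpha/\delta}$. The numbers below are illustrative, not real measurements.

```python
# Sketch: converting a precision bound into a lower bound on the cutoff.
# A dimension-6 term shifts a dimensionless observable by ~ alpha * (E/Lambda)^2,
# so a bound delta on deviations implies Lambda >~ E * sqrt(alpha/delta).
# All numbers are illustrative placeholders, not real experimental inputs.
import math

def cutoff_lower_bound(E, delta, alpha=1.0):
    return E * math.sqrt(alpha / delta)

E = 100.0  # probe energy, GeV (illustrative)
for delta in (1e-2, 1e-4, 1e-6):
    print(f"deviation bound {delta:.0e} at E = {E:g} GeV "
          f"-> Lambda >~ {cutoff_lower_bound(E, delta):.0e} GeV")
# Tighter precision pushes the inferred scale of new physics higher,
# assuming natural (order-one) dimensionless coefficients.
```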

In many applications of the EFT framework, we know the successor theory already. In these cases, we can check that the scale at which the EFT breaks down corresponds to the scales at which some new physics is expected to occur, by the lights of the successor theory. A classic example of this is the Fermi theory of weak interactions, an EFT describing four-fermion interactions at energies low compared to the mass of the weak gauge bosons. The four-fermion coupling is nonrenormalizable, and the effective strength of the interaction grows with energy, signaling a breakdown of the theory at energies on the order of 100 GeV. This is also the energy scale at which the W and Z bosons become important to describing weak physics; their masses are roughly 80 GeV and 91 GeV, respectively. Here, the scaling behavior of the EFT gives a close estimate of the energy scale at which to expect new physics. Similar results hold in condensed matter physics, where, e.g., phonon interactions as elementary excitations of the crystalline structure of solids can be given an EFT treatment.
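
The Fermi-theory estimate can be checked with one line of arithmetic. In the sketch below, the measured Fermi constant $G_F \approx 1.166 \times 10^{-5}\ \mathrm{GeV}^{-2}$ carries mass dimension $-2$, so, dropping order-one factors and assuming naturalness, the breakdown scale is $\Lambda \sim G_F^{-1/2}$:

```python
# Sketch: estimating the Fermi theory's breakdown scale from its
# nonrenormalizable coupling, ignoring order-one numerical factors.
import math

G_F = 1.166e-5  # measured Fermi constant, GeV^-2
Lambda_est = 1.0 / math.sqrt(G_F)
print(f"Lambda ~ G_F^(-1/2) = {Lambda_est:.0f} GeV")  # ~293 GeV

# Compare with the actual scale of electroweak physics:
m_W, m_Z = 80.4, 91.2  # GeV
print(f"m_W = {m_W} GeV, m_Z = {m_Z} GeV")
# The naive EFT estimate lands within a factor of a few of the true
# new-physics scale: exactly the pattern that fails for the SMEFT.
```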

When there are relevant parameters in the EFT, one should not even need sensitivity to nonrenormalizable terms to determine where the theory is expected to break down. Since relevant parameters scale with positive powers of $\Lambda$, they should provide low-energy evidence for the scale at which new physics comes in. Provided that naturalness holds, relevant parameters should have their low-energy values proportional to some positive power of $\Lambda$: $g_{\mathrm{rel}} = \alpha_{\mathrm{rel}} \Lambda^n$. In the SMEFT, there are two important relevant terms: the Higgs boson mass and the vacuum energy density. But both of these terms have a far smaller magnitude than would be suggested by even the most conservative estimates for $\Lambda$. The failures of naturalness and scaling for the Higgs mass and the vacuum energy density are the two major theoretical anomalies for the EFT framework. As I will argue below, these failures highlight important deficiencies in our understanding of intertheoretic relations using EFTs, and suggest that theoretical progress will likely require a conscious rejection of some of the assumptions required for the EFT framework.

2.1. Hierarchy problem

The first failure of EFT scaling of a relevant parameter in the Standard Model is the Higgs mass. A scalar field $\phi$ in a Lagrangian has dimension of $[mass]$, and the mass term $\frac{1}{2} m_{\mathrm{H}}^2 \phi^2$ must have dimension ${[mass]^4}$, which means that $m_{\mathrm{H}}^2$ has dimension of ${[mass]^2}$. Under the renormalization group scaling transformations discussed above, this means that the Higgs mass $m_{\mathrm{H}}$ should scale linearly with the EFT cutoff scale $\Lambda$. The Higgs mass was famously measured at the LHC from about 2012 to 2014, and was found to be about 125 GeV (ATLAS Collaboration 2012; CMS Collaboration 2012). Given even modest naturalness constraints, this value of the Higgs mass suggests that new, beyond Standard Model, physics should be nearby. Strong naturalness constraints suggest $\Lambda \lesssim 1\ \mathrm{TeV}$, while weaker constraints allow for $\Lambda \lesssim 1\ \mathrm{PeV} = 1000\ \mathrm{TeV}$.Footnote 6 In any case, naturalness and renormalization group scaling suggest that we should be on the cusp of producing direct evidence for new physics via particle accelerators. This was the hope of physicists in the lead-up to the first LHC run, as it was expected on naturalness grounds that new physics—supersymmetry, technicolor, composite Higgs, or something else entirely—would soon be discovered.
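
The naturalness arithmetic behind these bounds can be made explicit. If the relevant coupling scales as $m_{\mathrm{H}}^2 = \alpha \Lambda^2$, the measured Higgs mass fixes $\alpha$ for any assumed cutoff; the benchmark cutoffs in the sketch below are illustrative choices:

```python
# Sketch of the hierarchy-problem arithmetic: assume m_H^2 = alpha * Lambda^2
# and compute how small the dimensionless alpha must be for a given cutoff.
# The benchmark cutoff values are illustrative choices.

m_H = 125.0  # measured Higgs mass, GeV

benchmarks = {
    "1 TeV":        1.0e3,   # strong naturalness bound
    "1 PeV":        1.0e6,   # weaker naturalness bound
    "GUT scale":    1.0e16,  # ~10^16 GeV
    "Planck scale": 1.2e19,  # ~10^19 GeV
}

for label, Lam in benchmarks.items():
    alpha = (m_H / Lam) ** 2
    print(f"Lambda = {label:13s} -> alpha = {alpha:.1e}")
# A natural alpha (not << 1) requires Lambda near the TeV scale; a
# Planck-scale cutoff forces alpha ~ 1e-34, a severe fine-tuning.
```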

However, even before direct production events for new particles, one would expect to see indirect evidence of new physics in the form of modifications to cross-sections at energies near the scale of new physics. If we look at the SMEFT, we should expect some of the lowest mass dimension nonrenormalizable terms to start to make measurable contributions to observables at high energies $\mu \lesssim \Lambda$, or even to precision measurements at comparatively low energies. So far, however, no persistent anomalies have arisen between the purely renormalizable sector of the Standard Model and any precisely measured observables.Footnote 7 Furthermore, the properties of the Higgs boson have been found (so far) to be entirely compatible with a “vanilla” Higgs, i.e., a fundamental scalar field with spin 0 and no color or electric charge. By the very same EFT reasoning that suggests $\Lambda \lesssim 1$–$1000\ \mathrm{TeV}$, the precision agreement between measurement and the renormalizable Standard Model suggests that $\Lambda \gg 1000\ \mathrm{TeV}$. Thus there is a direct conflict in predictions for the value of $\Lambda$, both coming from within the EFT framework.

The conflict here only arises in thinking about the intertheoretic relationships implied by the EFT framework. If we think about problems of naturalness or fine-tuning with the Standard Model as a standalone theory, and without the renormalization group machinery relating it to possible higher-energy theories, then the mass of the Higgs is simply an empirically measured input parameter. The renormalization scale need not be set at some high energy at which physics breaks down—it can instead be set at the energy scales needed for the given application. If we think about the Standard Model on its own, then, as Manohar (2020, 84) notes, the argument relies “on the sensitivity of low-energy observables to high-energy (short distance) Lagrangian parameters. But treating this as a fundamental problem is based on attributing an unjustified importance to Lagrangian parameters.”Footnote 8 It is only when we justify these Lagrangian parameters as dictated by some successor theory to the Standard Model, and consider the relationships between that scale and the Higgs scale, that there is a clear statement of tension here.Footnote 9

This is one way to state the hierarchy problem for the forces in the Standard Model. The EFT framework suggests that there should be new physics just above the scale of the Higgs boson, but all evidence suggests a “physics desert” between the TeV scale and the Planck scale, where quantum gravity effects become important. The issue here is the conflicting theoretical expectations given by the scaling behavior of the relevant parameter (the Higgs mass) and the rest of the observables in the Standard Model. Both expectations are justified via the EFT understanding of intertheoretic relations via scaling. Scaling of the relevant parameter suggests that a new theory is required at accessible energies, while empirical tests of the scaling of irrelevant parameters suggest that the current theory is doing just fine. In a sense, the frustration lies in the fact that our best theory does a better job of capturing the phenomena than we expect it to! But this is a problem for using the framework to understand intertheoretic relations, because the SMEFT is failing to delimit its own domain of applicability. A related problem concerns the scaling of vacuum energy density for the Standard Model, when coupled to gravity in an EFT.

2.2. Cosmological constant problem

In many ways, the form of the cosmological constant problem is the same as the hierarchy problem for the Higgs mass. The problem involves the expectation that a relevant parameter in the SMEFT—the expectation value of vacuum energy density $\langle\rho\rangle$—should scale with some positive power of $\Lambda$, while its empirically determined value is significantly smaller. In one sense, the cosmological constant problem is worse, while in another sense it is less of a problem than the Higgs mass. It is worse in that the relevant parameter here scales with $\Lambda^4$, so the theoretical expectation for the value of $\langle\rho\rangle$ is significantly higher. It is less of a problem in the sense that $\langle\rho\rangle$ plays no role in any observables in the Standard Model. It is only under the assumption that $\langle\rho\rangle$ should gravitate that one can link it to the cosmological constant and the observed accelerating expansion of space.Footnote 10

When treated as a full EFT, where all possible terms consistent with the symmetries of the theory are included, the SMEFT must include a constant term as the first operator in its Lagrangian. This is a term not multiplied by any field operators, and therefore has dimension ${[mass]^4}$. Such a term already exists in the formalism of QFT: the sum of expectation values of ground-state (vacuum) energy densities corresponding to each of the fields in the Lagrangian. Given its mass dimension, the vacuum energy density term should have its low-energy value set by $\Lambda^4$, meaning that the value should be enormous, even if $\Lambda$ is conservatively set to 1 TeV. Unlike the Higgs mass, however, vacuum energy density is not directly measurable in the context of particle physics; only energy differences are meaningful, and since $\langle\rho\rangle$ is a stable zero-point energy, it turns out to play no direct role in calculating any observable quantities.

The problem arises when one includes gravity in an EFT treatment, thereby going beyond the Standard Model and considering intertheoretic relationships with general relativity. In general relativity, all energy gravitates, and it turns out that the vacuum energy density of quantum fields contributes to the effective cosmological constant, a term in the Einstein field equations that is currently used by cosmologists to model the accelerated expansion of the universe. For a uniformly expanding universe, the cosmological constant provides an energy density that is everywhere constant, with a negative pressure that leads to expansion. Accelerated expansion is something cosmologists and astrophysicists can infer from the cosmic microwave background data or from certain types of supernovae. Thus, when we combine the Standard Model with general relativity in the EFT framework, we have a way to compare the theoretically expected value of $\langle\rho\rangle$ with an observable quantity determining the rate of acceleration of the universe’s expansion.Footnote 11 This comparison requires us to merge two theories together within the EFT framework, into what Wallace (2022) calls a theory of low-energy quantum gravity.

The comparison leads to a major theoretical anomaly, and has been called the “worst prediction in all of physics” by Hobson et al. (2006). If we trust the renormalization group scaling arguments laid out in section 2, then $\langle\rho\rangle \propto \Lambda^4$. Regardless of the scale at which new physics comes in, this is an enormous value, especially compared to the measured rate of expansion we observe in the universe. Depending on whether one takes $\Lambda$ to be on the order of 1 TeV or the Planck scale, $10^{16}$ TeV, the discrepancy is between 60 and 120 orders of magnitude. While a more dramatic disagreement than the Higgs mass, the same mechanism is to blame. The scaling behavior we expect to hold based on other successful uses of the renormalization group tells us that relevant parameters ought to have low-energy values near the breakdown scale of the EFT, while measurements imply that the value is actually significantly lower.
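
The headline numbers can be reproduced with a few lines of arithmetic. The sketch below compares the naive estimate $\langle\rho\rangle \sim \Lambda^4$ with a standard benchmark value for the observed dark-energy density, roughly $(2.3 \times 10^{-3}\ \mathrm{eV})^4 \approx 10^{-47}\ \mathrm{GeV}^4$:

```python
# Sketch of the "worst prediction" arithmetic: naive vacuum energy density
# <rho> ~ Lambda^4 versus the observed dark-energy density. The observed
# value used here is a standard benchmark, ~ (2.3e-3 eV)^4.
import math

rho_obs = (2.3e-12) ** 4  # GeV^4, since 2.3e-3 eV = 2.3e-12 GeV

for label, Lam in [("1 TeV", 1.0e3), ("Planck scale", 1.2e19)]:
    rho_eft = Lam ** 4                      # naive scaling estimate, GeV^4
    orders = math.log10(rho_eft / rho_obs)  # size of the discrepancy
    print(f"Lambda = {label:12s}: rho_EFT / rho_obs ~ 10^{orders:.0f}")
# Roughly 10^59 for a TeV cutoff and 10^123 for a Planck-scale cutoff:
# the 60-to-120-orders-of-magnitude discrepancy quoted above.
```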

For the cosmological constant problem, the failure to find evidence of supersymmetry at the LHC dealt a major blow to the best anticipated solution. Supersymmetry is an additional symmetry added to an EFT that transforms bosonic fields into fermionic fields and vice versa. Fermions and bosons contribute vacuum energy density of opposite signs; if supersymmetry were a symmetry of nature, then each particle would have a superpartner contributing to $\langle\rho\rangle$, such that the total contributions would cancel to zero. This null result has thus spoiled hopes that new physics within the EFT framework would fix the problem.

In both cases, then, there are relevant parameters in the Standard Model Lagrangian that the EFT framework suggests should have values set by the energy scales of an unknown successor theory. On this basis, the expectation was that the LHC would reveal evidence of physics beyond the Standard Model. The failure to find new physics has instead led us to reconsider aspects of the EFT framework, naturalness, or particular candidate solutions. But this much is clear: as it stands, our usual understanding of scaling relationships for relevant parameters leads to theoretical anomalies when applied to the Standard Model. There are several ways to respond to these anomalies; I turn attention to these next.

3. Beyond EFTs and scale separation

Physicists have long been aware of these potential anomalies in the Standard Model. The cosmological constant problem has been widely known in the particle physics and quantum gravity communities since the early 1980s, and while the hierarchy problem has taken different forms, physicists have been positing beyond Standard Model physics to address it since the 1970s. What is new about these problems is their stubborn resistance to the most commonly expected solutions that retain naturalness and the EFT framework. The failure to find new physics near the Higgs mass scale at the LHC heavily disfavored nearly all of these solutions. Though it is still possible that, e.g., supersymmetry is a symmetry of nature, the lower bound on the symmetry-breaking scale is now too high to help with either of the two problems.

The resistance of these anomalies to “standard” solution strategies within the EFT framework poses a major obstacle to the claim that EFTs provide a quantitative, prospective structure for intertheoretic relationships. As I detailed in the previous section, the problem arises for understanding scaling of empirically meaningful relevant parameters in the Standard Model. There are four possible ways to overcome this hurdle, in increasing order of conceptual departure from the EFT framework:

(1) Retain renormalization group scaling and find a restoration of naturalness. This has been the dominant approach for the last few decades, and includes supersymmetry, composite Higgs, technicolor, etc. One could argue that the failure to find a solution is not a problem with the framework, but instead a failure of imagination to construct the right sort of model within the framework.

(2) Reject naturalness, maintain EFT framework and renormalization group for intertheoretic relations. This implies that our indirect access to physics beyond the Standard Model is even harder to obtain than initially thought, since couplings may be unnaturally small to suppress new physics effects in the SMEFT. While this is a plausible empirically minded attitude to take, it leads to a severe form of pessimism for the prospects of successful detection of new physics.

(3) Maintain EFT framework, reject its characterization of intertheoretic relations. Similar to option 2, but instead of rejecting naturalness, reject the characterization of intertheoretic relations. This means that one treats the SMEFT as a standalone theory, giving no insight into a successor. This requires a reinterpretation of renormalization group methods as underwriting scaling for a single theory, and a reinterpretation of “bare” parameters in a Lagrangian.

(4) Reject the EFT framework for intertheoretic relations between the Standard Model and whatever comes next. This response suggests that major conceptual shifts will be needed to resolve the anomalies in SMEFT, and that the EFT framework is inherently limited. Conceptual shifts might come in the form of reformulations of the Standard Model, and they might point the way to further conceptual shifts needed to understand the relationships between the Standard Model and a successor theory like quantum gravity.

These options do not exist solely as responses to scaling issues and intertheoretic relations for the Standard Model; in general they are avenues for constructing candidate theories of physics beyond the Standard Model and may have many independent motivations. Option 1 has been the dominant strategy for constructing models of physics beyond the Standard Model for about 50 years. Before the LHC was operational, it seemed most reasonable to expect that naturalness would continue to be a good guide for constructing EFTs. Models of grand unified theories, supersymmetry, and composite Higgs models were proposed as successors to the SMEFT. But the failure to find empirical evidence to support any of these models at sufficiently low energies to help solve one or both of the problems has made this avenue of response much less attractive. While it is possible that some new model might be thought up that fits the EFT framework with naturalness, the prospects seem dim, since all internal indicators from within the SMEFT suggest that the scale of new physics should be accessible now. Many of the proposed natural extensions of the Standard Model no longer solve these theoretical anomalies, and are now postulated instead because of their fit with string theory.

Option 2 is the most common response of theorists and experimentalists since the LHC has been operational. According to this response, we should retain trust in the EFT framework, but give up on naturalness as a criterion for dimensionless couplings. This is a strongly empirical approach; if the framework is still able to account for the phenomena, then there is nothing to worry about. The problem, as discussed above, is that the Standard Model is far more empirically adequate than we have any theoretical reason to expect. So the most conservative option is to reject only the minimal assumptions leading to this theoretical expectation. However, there is still substantial content that we lose by giving up naturalness as a guide to intertheoretic relations. Giudice (2008, 2017), the head of the theoretical physics division at CERN and a former proponent of naturalness, has since argued that particle physics has entered the post-naturalness era, and that this is something of a Kuhnian crisis period for the discipline. Wallace (2019) has similarly argued that giving up on naturalness principles generally would lead to problems for thinking about how physical theories relate to each other even outside the scope of particle physics. Indeed, naturalness principles might seem indispensable to the success of reasoning in physics. But nevertheless, at least the narrow form of naturalness employed in EFTs seems to be in jeopardy. By giving up on it, we eliminate the problems as a conflict between theory and experiment. Perhaps the “bare” parameters set by a successor theory, and applicable at energies near the cutoff scale, are simply unnaturally small.

While this response has the advantage of being conservative of the EFT framework, it faces a few problems. First, the formulation of naturalness that one can give up on while retaining the EFT framework is controversial. Naturalness is not something with an uncontested definition. The form that one can reject in this way depends on taking the bare parameters in an EFT Lagrangian—the couplings defined at the cutoff scale $\Lambda$—as physically privileged, which is usually justified by assuming the successor theory will have some mechanism to set those parameter values. Then the renormalization group scaling simply reparameterizes those physically privileged values at lower energy scales. By giving up this form of naturalness, the bare parameters lose their meaning as coming from a more fundamental theory via the renormalization group, and one might then slide into option 3, which is to reject the EFT characterization of intertheoretic relations altogether. Manohar (2020) endorses this departure, as do Rosaler and Harlander (2019). Manohar, in particular, claims that “the only version of the hierarchy problem which does not depend on how experimental observables are calculated … is simply the statement that two masses, [the Higgs and the scale for new physics] are very different” (84). This problem, if it is indeed a problem at all, is not affected by whether or not we retain naturalness as a criterion for relating parameters between EFTs. Williams (2015) also argues that the most justified form of naturalness principle is tightly tied to the assumption of scale separation, essential for the applicability of the EFT framework and its constituent scaling relations. On Williams’s view of naturalness, one cannot give it up without also rejecting a central pillar of the EFT framework. Without scale separation, one cannot form a principled separation of energy scales within the domain of an EFT and those outside its domain. Without this form of naturalness, one retains a phenomenologically successful SMEFT, but without any of the underlying theoretical support justifying the use of the EFT framework as a whole. Then, the ability to accommodate new discrepancies by including low-order nonrenormalizable terms would appear to be little more than adding new free parameters to a model and tuning to fit new data.

This is where option 3 comes in. If one revises their view of the EFT framework along the lines outlined by Manohar (2020) and Rosaler and Harlander (2019), then one can keep naturalness as scale separation, and simply reject the idea that the EFT framework can tell us anything about a Standard Model successor. Franklin (2020) makes a case against naturalness as explaining the success of EFTs on precisely these grounds. Instead, he takes renormalizability to be the key feature, which is a property of a theory independent of its relation to a successor. Though this view is more principled, it comes with the major cost of undermining the EFT view of intertheoretic relations more generally. For both options 2 and 3, an additional cost of these modifications is the loss of explanatory power regarding the relationship between different EFTs. In particular, prospective realism for the SMEFT based on renormalization group arguments, as in Fraser (2020) and Williams (2019), is undercut. This form of realism depends on the insensitivity of low-energy physics to the values of high-energy parameters. For option 2, rejecting naturalness gives no principled reason why the effects from a high-energy successor are guaranteed to have small impacts on low-energy physics. Option 3 rejects the idea of relating the SMEFT to a successor via the renormalization group altogether, so the stability analysis cannot get off the ground.

Second, the rejection of naturalness (option 2) or intertheoretic links (option 3) leads to more questions than answers. Why does naturalness seem to fail here, when it succeeds nearly everywhere else within the EFT framework? Why does the renormalization group work for relating other EFTs, but fail for the SMEFT? What is it about physics beyond the Standard Model that leads to this unnaturalness? Giving up these principles for the EFT framework as a whole due to their failure for the SMEFT seems too drastic a move, despite its apparent conservatism. Much of the foundational work on the philosophical significance of EFTs makes crucial reference to naturalness or intertheoretic relations; wholesale changes to our understanding of the framework undermine the conclusions of such work elsewhere in particle and condensed matter physics. On the theoretical side, it is too permissive to allow the construction of models of new physics that fit within the EFT framework but reject naturalness. Parameters for new models can simply be tuned to avoid falsification, and there is no guiding principle for when this tuning should be allowed and when it shouldn’t. On the experimental side, a rejection of naturalness causes less concern, but it might lead to pessimism regarding the prospects of finding new physics. Since the way out of the cosmological constant and hierarchy problems is to set bare parameters to unnaturally small values, we are essentially stating that new physics is far more heavily suppressed than we should expect on naturalness grounds. Since we have no reasonable expectation of the degree of suppression, there are likewise no reasonable expectations regarding energy scales at which we should expect to see new physics. Again, option 3 has the resources to respond to this concern: it is not that we have allowed unnatural fine-tuning, but only that we have given undue significance to the high-energy parameters in a given Lagrangian. However, similar issues arise when we take option 3 as a starting point for going beyond the SMEFT. The framework no longer provides connections to anything beyond the Standard Model. Essentially, in the search for new physics, option 3 becomes indistinguishable from option 4.

Option 4 is the most conceptually radical, but strikes me as the most promising way forward for constructing new theories beyond the Standard Model. The lesson from the cosmological constant and hierarchy problems is that the EFT framework fails to get the scales for new physics correct, and that the problem lies in the relevant parameters in the SMEFT. The failure of model-building approaches under option 1 should suggest that this is a persistent problem with our understanding of the relationship between the Standard Model and its successor theory. Since the EFT framework works elsewhere for intertheoretic relationships between EFTs, the failure here suggests that the successor theory does not fit within the EFT framework. One advantage of this approach is that it does not force a revision to our understanding of other successful applications of EFTs and naturalness. This means that one can retain the understanding of EFT-based intertheoretic relations at energy scales below the SMEFT. In this sense, option 4 is conservative of the current interpretation of the EFT framework; the claim is that despite its successes, it breaks down beyond the Standard Model. Another advantage is that it fits with the expectations that quantum gravity will require moving beyond the EFT framework (Bianchi and Rovelli 2010; Freidel et al. 2023; Koberinski and Smeenk 2023). The EFT framework is a framework for local physics that depends on having sufficient spacetime symmetry structure, a well-defined notion of energy, and scale separation. Various proposed quantum gravity models violate one or more of these assumptions, such that the EFT framework will only be able to capture special limiting cases of the theory. However, the argument here is distinct from generic expectations that quantum gravity will fall outside the scope of EFT. For the latter, it is possible that there are several more successor EFTs between the energy scales relevant for the Standard Model and those for quantum gravity. The lesson I am arguing for here is that the anomalies with relevant parameters suggest that whatever theory immediately succeeds the Standard Model will not be an EFT.Footnote 12

What about arguments for the essential role of naturalness in physics? Those arguing for its essential role for EFTs, such as Williams (2015), start within the EFT framework as given; insofar as one rejects the applicability of the EFT framework, such arguments no longer apply. However, there are more general arguments for naturalness that do not assume the EFT framework. Wallace (2019), for instance, argues that some version of a naturalness criterion is ubiquitous and essential to the practice of physics. Wallace’s version of a naturalness criterion is the requirement that inputs to a theory not be extremely fine-tuned. At this level of generality, the criterion seems plausibly applicable to many domains outside of the EFT context. Presumably, if such a criterion is truly essential to physics, a version of it will be found to hold in whatever new formalism for beyond Standard Model physics we arrive at.

The biggest drawback to option 4 is that it is highly unconstrained. Rejection of the EFT framework for understanding new physics leaves open a very wide range of possibilities, with a nearly endless range of possible low-energy empirical signatures. One promising avenue forward is to consider reformulations of current QFTs into a different formalism from EFT. These alternative formulations might provide hints as to the way forward. For example, Hollands and Wald (2023) take a position-space representation of QFT as their starting point, and build theories using the operator-product expansion near coincident points as a foundation. This is a more explicitly local formulation of QFT, not relying on Fourier transformations and the accompanying symmetries in the background spacetime. Working in position space, the energy-centric EFT framework does not straightforwardly apply. However, this is only one example; several other avenues could fruitfully be pursued. What will ultimately decide the best way forward is an approach that makes novel, successful empirical predictions. Thus, I argue that a two-pronged attack is needed to find the path forward. For constructing models of new physics, choose option 4; the current crisis in particle physics demands fresh new ideas. But in order to search for new physical effects, we need to take the current framework very seriously and hold onto option 2 or 3. The latter will provide a clear, structured framework within which to organize searches for new physical effects.

4. Closing the loop and conceptual shifts

While much of the focus on theory change in physics is centered on the conceptual and theoretical changes between an old and a new theory, the major driving force for these conceptual shifts has historically been anomalous empirical results. The interesting difference for the Standard Model anomalies is that they are in some sense purely theoretical; the Standard Model can successfully accommodate the supposedly anomalous results through the usual avenue of allowing couplings in an EFT to be fixed empirically. This means that theory construction is highly unconstrained, as no new empirical anomalies need to be accounted for.

Smith (2010, 2014) makes a convincing case that discrepancies between theory and observation actually serve a stronger positive epistemic function than straightforward theory falsification. Discrepancies serve to highlight new physical effects that may have been omitted in the previous model of the system, and serve as signals that increased precision of measurement reveals sub-leading physical effects requiring more detailed models. The new physical effects are treated as real when they are found to contribute to otherwise independent observables in the theory, and this is one way the theory guides discovery of new entities, forces, or other physical dependencies. When these fit within the conceptual framework provided by the theory, we get a richer picture of the physics within this domain, and close the epistemic loop between theoretical modelling and precision measurement. Such highly constrained relationships between theoretical entities and observables are often stable under theory change as well, at least when qualified to hold to a certain degree of precision within the bounds where the old theory is a good approximation. When the discrepancies persist and cannot be accounted for within the current theoretical framework, this is a strong signal that new physics is required. But it is only by taking the current framework seriously, and exhausting all possibilities to close the loop, that we have true empirical anomalies and good evidence that a new framework is required. In order to construct a new framework, we must therefore push the current framework to its empirical breaking point.

It is this latter stage that is missing from the crisis facing the Standard Model. Despite the problems in our theoretical understanding of scaling, there are no major persistent empirical anomalies facing the Standard Model.Footnote 13 So what is the way forward? So far we have found breaking points only on the side of theory, and only when we try to apply the current framework to extrapolate relations between the Standard Model and some as-yet-unknown future physics. Direct detection of new particles seems unlikely given the current lack of evidence, so we must turn attention to different avenues of testing. In particular, (relatively) low-energy precision tests of the Standard Model offer a promising way forward and a complementary perspective to the rejection of the EFT framework suggested in section 3. This division of labor seems like the optimal path forward for constructing new theories in particle physics.

Luckily, such a strategy is already being pursued. Precision tests of the Standard Model have been conducted since the birth of quantum electrodynamics (QED). Koberinski and Smeenk (Reference Koberinski and Smeenk2020) treat the precision testing of the anomalous magnetic moment of the electron in some detail, and argue that, at current levels of precision, “pure QED” alone is insufficient to capture all the relevant physics. Strong and weak virtual effects on the electron’s self-energy make a significant contribution to the anomalous magnetic moment, and without factoring these in, there would be a persistent empirical discrepancy between the predictions of QED and the measured value. Koberinski (Reference Koberinski2023) extends this line of reasoning to current searches for anomalies with respect to the renormalizable sector of the Standard Model. The idea is that the EFT framework can be thought of as a generalization of the old QFT framework that drops the requirement of renormalizability. This creates a well-defined theory space, parameterized by the values of each possible coupling constant for a given EFT; trajectories through this space describe a theory’s scaling behavior.
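To make the theory-space picture concrete, here is a minimal sketch (the notation is illustrative, not drawn from the works cited above). A point in theory space is an assignment of values to every coupling $g_i$ allowed by the field content and symmetries, and the renormalization group equations

$$\mu \frac{d g_i}{d \mu} = \beta_i(g_1, g_2, \ldots)$$

govern how those couplings change with the probing scale $\mu$. A trajectory through theory space is a solution to these coupled equations, and whether a coupling is relevant, marginal, or irrelevant is read off from whether its dimensionless form grows, stays fixed, or shrinks along the flow toward low energies.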

Even in cases where the intertheoretic relations seem to break down, the EFT framework can still be useful for organizing and parameterizing deviations from the renormalizable Standard Model. By adopting option 2 or 3 outlined above, we can reject the use of the EFT framework as providing reliable estimates of the domain of applicability of the SMEFT via the relevant parameters. Instead, we can focus on finding evidence for the effects of irrelevant parameters. The EFT expansion and ordering of new terms by mass dimension provides a systematic way to sort the relative importance of possible new terms, each of which impacts several observables in the theory. This structure of new terms does not depend directly on the implied relationship to the Standard Model’s successor, and can be treated as a standalone framework for an individual EFT.
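As a concrete sketch of this ordering (the schematic form is standard; the notation here is mine), the SMEFT Lagrangian can be written as

$$\mathcal{L}_{\rm SMEFT} = \mathcal{L}_{\rm SM} + \sum_{d > 4} \sum_i \frac{c_i^{(d)}}{\Lambda^{d-4}} \mathcal{O}_i^{(d)},$$

where the $\mathcal{O}_i^{(d)}$ are local operators of mass dimension $d$ built from Standard Model fields, $\Lambda$ is the scale suppressing new physics, and the dimensionless Wilson coefficients $c_i^{(d)}$ are fixed empirically. At energies $E \ll \Lambda$, a dimension-$d$ operator contributes to observables at order $(E/\Lambda)^{d-4}$, so mass dimension directly orders the expected size of deviations from the renormalizable Standard Model.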

The biggest hurdle to treating the SMEFT this way is the enormous number of possible terms contributing at next-to-leading order. There are 2499 possible terms at first nonrenormalizable order for the SMEFT, making a systematic exploration of even a small number of new terms prohibitively impractical (Bechtle et al. Reference Bechtle, Chall, King, Krämer, Mättig and Stöltzner2022). This number of free parameters is greatly reduced, however, when one considers the structural relations inherent in an EFT and imposes consistency constraints from experimentally known physics in the Standard Model. These constraints also single out various experimental sectors as the most promising avenues for finding anomalies (de Blas et al. Reference de Blas, Du, Grojean, Gu, Miralles, Peskin, Tian, Vos and Vryonidou2022). In effect, one targets searches for new physics by focusing on areas where there is some tension between measured and predicted values of observables, but where that tension is within the current margins of experimental error. The SMEFT then provides additional theoretical reason to expect that some of these tensions might persist as signals of new physics, in particular those with no global constraints forcing them to take a particular value. These sectors of the SMEFT can then be chosen as the targets of future colliders, or of precision measurements using smaller tabletop devices. Currently, the most promising avenues for precision collider experiments target the weak sector (notably precision measurements of W- and Z-boson phenomena), the Higgs, and top and bottom quark physics (de Blas et al. Reference de Blas, Du, Grojean, Gu, Miralles, Peskin, Tian, Vos and Vryonidou2022). In all these cases, there is some hint of tension between the current state-of-the-art predicted and measured parameter values, but the current lack of experimental precision makes these tensions statistically insignificant. Thus, designing more precise tests of these parameter values is a top priority. For low-energy precision tests, the anomalous magnetic moment of the muon has also been a recent focus of attention, though again the tension here has not reached the threshold of a new discovery, and subsequent work has started to dissolve the apparent tension (Koberinski Reference Koberinski2022).
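The logic behind such targeted searches can be captured in a simple scaling estimate (again schematic, with illustrative notation). A single dimension-six term shifts an observable as

$$O_{\rm meas} \simeq O_{\rm SM}\left(1 + a\,\frac{c\, v^2}{\Lambda^2}\right),$$

with $v$ the electroweak scale and $a$ an order-one factor calculable within the SMEFT. A prediction and measurement that agree to fractional precision $\delta$ therefore probe scales up to $\Lambda \sim v\sqrt{a c/\delta}$: improving precision by two orders of magnitude extends the reach in $\Lambda$ tenfold, with no increase in collider energy.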

This suggests a division of labor in attitudes towards the SMEFT and the EFT framework more generally. While those pursuing the project of theory construction beyond the Standard Model should reject the EFT framework when constructing new candidate theories, those looking for persistent empirical anomalies should take the framework as seriously as possible. Theorists pursuing this avenue should strive to make predictions as detailed as possible, while experimentalists should use the EFT framework to target sectors of the Standard Model. The role of precision measurement is to reveal which tensions are merely due to a lack of detail, and which are persistent anomalies that must be explained by new theories. Take, for example, the quantum revolution. There were persistent theoretical anomalies with classical physics, such as the stability of the atom and the electron’s self-interaction. But these theoretical anomalies were not enough on their own to suggest the way forward; it took a flood of anomalous experimental results to usher in the quantum revolution. Despite the recognition of the breakdown of EFT methods for relevant parameters in the SMEFT, we do not yet have the accompanying empirical anomalies. The only way to generate them is to predicate precision tests on the truth of the very framework we know is flawed.

5. Conclusions

The EFT framework has played an essential role in the foundations of twentieth-century physics for several reasons. Born out of the necessity of restricting the energy scales at which a QFT could be defined, it led to a new way of conceptualizing the scope of theories and of quantifying intertheoretic relationships. Many of these insights are valuable for general issues in the philosophy of science, and do not depend on the EFT framework being applicable to future theories. However, I have argued that the two major theoretical anomalies in the SMEFT—the hierarchy problem and the cosmological constant problem—pose real problems for the applicability of the EFT framework beyond the Standard Model. In particular, I argued that they both reflect a breakdown in how we understand the role of relevant parameters as providing insight into the breakdown scale of an EFT. The lesson we should take from this breakdown is that the EFT framework cannot be used to understand the relationship between the Standard Model and its (as yet unknown) successor theory. Theory construction beyond the Standard Model should consciously seek to break assumptions necessary for the EFT framework. This conclusion impacts local arguments supporting a realism for the Standard Model based on the EFT framework. Since we cannot trust the framework to provide prospective information on how the Standard Model relates to its successor, local forms of effective realism seem to face a barrier. However, unlike the options that involve modifying the EFT framework to address the theoretical anomalies with the SMEFT, this move preserves the insight gained regarding intertheoretic relations wherever else the EFT framework still applies, including lower-energy particle physics and condensed matter physics.

I have also argued that the theoretical anomalies for the Standard Model are not enough to point the way forward. The history of science tells us that major conceptual changes require empirical anomalies for guidance; the lack of evidence for new physics at the LHC appears to make the prospects for empirical anomalies dim. However, the way forward is to continue to stress test the Standard Model via precision testing. This sort of testing requires one to assume the validity of the EFT framework, in order to structure searches for minute discrepancies between theory and experiment. The persistent discrepancies will provide much-needed empirical hints to serve as touchstones for theory construction beyond the Standard Model. The community of particle physicists must therefore have a division of labor; theorists constructing new models should consciously break out of the EFT framework, while others work out its consequences for the Standard Model in increasing detail.

Acknowledgments

I am grateful to Doreen Fraser, Chris Smeenk, Eleanor Knox, and Yichen Luo for helpful discussions related to this paper, and to the audience at the Quantum Spacetime in the Cosmos workshop at the Perimeter Institute for Theoretical Physics for feedback on an early version of this talk. I am also grateful for the feedback from two anonymous reviewers.

Footnotes

1 Here I follow standard usage in speaking of the renormalization group, though what falls under this label is better understood as a family of methods for understanding scaling behavior. See Fraser (Reference Fraser2021) and Koberinski and Fraser (Reference Koberinski and Fraser2023) for further discussion of these points.

2 In the literature on EFTs, the distinction between top-down and bottom-up is in reference to an energy scale, and is therefore the opposite of the usual metaphor for fundamental theories in philosophy of science. The higher-energy theories are at the top of the scale, and these are usually taken to be more fundamental than the low-energy counterparts.

3 Particle physicists work in units where $\hbar = c = 1$, such that mass/energy is the only meaningful dimensionful quantity, and length has dimension $[\mathrm{mass}]^{-1}$.

4 In practice, the scaling behavior of couplings is determined perturbatively, and terms that appear marginal at one order of perturbation theory will usually turn out to be marginally relevant or marginally irrelevant at higher orders. Such couplings still grow or shrink under renormalization group flow, but much more slowly than fully relevant or irrelevant couplings.

5 As we will soon see, the major theoretical anomalies for the Standard Model involve the scaling behavior of relevant parameters in the SMEFT.

6 There is no sharp demarcation for when a quantity is “natural.” What counts as much greater or much less than 1 is context dependent. However, it is generally granted that naturalness implies that dimensionless quantities should be within a few orders of magnitude of 1.

7 There have been, and continue to be, many transient anomalies that seem to hint at the breakdown of the renormalizable Standard Model. There was the well-publicized 750 GeV excess reported by ATLAS and CMS that was later attributed to a statistical fluctuation, and more recently the muon anomalous magnetic moment, the W-mass anomaly, and the anomalies in b quark decays (D’Alise et al. Reference D’Alise2022). None of these anomalies has crossed the $5\sigma$ threshold to count as a new discovery, and subsequent testing has thus far reduced the statistical significance of the tensions, in favor of the Standard Model.

8 Manohar’s argument is somewhat stronger than this. He argues that there is no sense in which the high-energy Lagrangian parameters have any meaning. Rosaler and Harlander (Reference Rosaler and Harlander2019) put forth a similar argument. While I am sympathetic to this understanding of the renormalization group, it is certainly not the consensus understanding of renormalization group scaling for EFTs. My goal in this paper is to highlight issues with the EFT framework as understood and used by the majority of physicists.

9 This, of course, assumes that the successor theory itself fits within the EFT framework. As I will suggest in section 3, it is this very assumption that we should drop in light of the hierarchy and cosmological constant problems.

10 For reasons of space, this summary of the cosmological constant problem is brief. See Koberinski (Reference Koberinski, Wüthrich, Le Bihan and Huggett2021a,Reference Koberinskib), Rugh and Zinkernagel (Reference Rugh and Zinkernagel2002), Schneider (Reference Schneider2020), and Wallace (Reference Wallace2022) for critical philosophical discussions of the problem.

11 I am eliding many details and subtleties here for the purpose of a clear exposition. The cosmological constant is obviously not directly observable, but is tied to observable quantities via the theory of general relativity in the $\Lambda$CDM model of cosmology. This “indirect” connection is potentially more tenuous than other connections between theoretical quantities and observables in physics, but the difference is only one of degree. We similarly do not directly observe the mass of the Higgs boson, for example, but infer it via a complicated chain of inferences from theory to simulation of events to a model of the apparatus (Karaca Reference Karaca2018; Staley Reference Staley2020). The salient difference is that connecting $\langle \rho \rangle$ to the cosmological constant requires linking different theories.
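For orientation, the textbook form of that connection (in units where $c = 1$) is $\Lambda_{\rm eff} = \Lambda_{0} + 8\pi G \langle \rho \rangle$, where $\Lambda_{0}$ is the bare constant in the gravitational action: only the combined $\Lambda_{\rm eff}$ is constrained by cosmological observations, which is part of what makes the inference from $\langle \rho \rangle$ so indirect.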

12 In some sense, we already have an EFT treatment of physics that goes strictly beyond the Standard Model. What Wallace (Reference Wallace2022) calls “low-energy quantum gravity” treats Standard Model matter and gravity together as an EFT. In certain regimes, this treatment works extremely well, but there are general reasons to expect that it can only work as a special limiting case (Koberinski and Smeenk Reference Koberinski and Smeenk2023). In any case, this is not a beyond Standard Model theory in the sense meant here, as the relevant energy scales at which this EFT is valid overlap with those of the Standard Model.

13 There are some exceptions, including the apparent mass of neutrinos and dark matter, but these seem to be less of a focus for theory construction thus far. Responses to these potential empirical anomalies have largely been phenomenological.

References

ATLAS Collaboration. 2012. “A Particle Consistent with the Higgs Boson Observed with the ATLAS Detector at the Large Hadron Collider”. Science 338 (6114):1576–82. https://doi.org/10.1126/science.1232005
Bechtle, Philip, Chall, Cristin, King, Martin, Krämer, Michael, Mättig, Peter, and Stöltzner, Michael. 2022. “Bottoms Up: The Standard Model Effective Field Theory from a Model Perspective”. Studies in History and Philosophy of Science 92:129–43. https://doi.org/10.1016/j.shpsa.2022.01.014
Bianchi, Eugenio and Rovelli, Carlo. 2010. “Why All These Prejudices Against a Constant?” arXiv preprint. https://doi.org/10.48550/arXiv.1002.3966
Burgess, Cliff P. 2004. “Quantum Gravity in Everyday Life: General Relativity as an Effective Field Theory”. Living Reviews in Relativity 7 (1):5. https://doi.org/10.12942/lrr-2004-5
Butterfield, Jeremy and Bouatta, Nazim. 2015. “Renormalization for Philosophers”. In Metaphysics in Contemporary Physics, edited by Bigaj, Tomasz and Wüthrich, Christian, 437–85. Leiden: Brill.
CMS Collaboration. 2012. “A New Boson with a Mass of 125 GeV Observed with the CMS Experiment at the Large Hadron Collider”. Science 338 (6114):1569–75. https://doi.org/10.1126/science.1230816
D’Alise, Alessandra et al. 2022. “Standard Model Anomalies: Lepton Flavour Non-Universality, g-2 and W-Mass”. Journal of High Energy Physics 2022 (8):125. https://doi.org/10.1007/JHEP08(2022)125
de Blas, Jorge, Du, Yong, Grojean, Christophe, Gu, Jiayin, Miralles, Victor, Peskin, Michael E., Tian, Junping, Vos, Marcel, and Vryonidou, Eleni. 2022. “Global SMEFT Fits at Future Colliders”. arXiv preprint. https://doi.org/10.48550/arXiv.2206.08326
Dizadji-Bahmani, Foad. 2021. “Nagelian Reduction in Physics”. In The Routledge Companion to Philosophy of Physics, edited by Knox, Eleanor and Wilson, Alistair, 499–511. London: Routledge.
Donoghue, John F. 2012. “The Effective Field Theory Treatment of Quantum Gravity”. AIP Conference Proceedings 1483 (1):73–94. https://doi.org/10.1063/1.4756964
Dougherty, John. 2023. “Effective and Selective Realisms”. The British Journal for the Philosophy of Science. https://doi.org/10.1086/724978
Franklin, Alexander. 2020. “Whence the Effectiveness of Effective Field Theories?” The British Journal for the Philosophy of Science 71 (4):1235–59. https://doi.org/10.1093/bjps/axy050
Fraser, James D. 2020. “Toward a Realist View of Quantum Field Theory”. In Scientific Realism and the Quantum, edited by French, Steven and Saatsi, Juha. Oxford: Oxford University Press.
Fraser, James D. 2021. “The Twin Origins of Renormalization Group Concepts”. Studies in History and Philosophy of Science Part A 89:114–28. https://doi.org/10.1016/j.shpsa.2021.08.002
Freidel, Laurent, Kowalski-Glikman, Jerzy, Leigh, Robert G., and Minic, Djordje. 2023. “On the Inevitable Lightness of Vacuum”. International Journal of Modern Physics D 32 (14). https://doi.org/10.1142/S021827182342004X
Giudice, Gian Francesco. 2008. “Naturally Speaking: The Naturalness Criterion and Physics at the LHC”. In Perspectives on LHC Physics, edited by Kane, Gordon and Pierce, Aaron, 155–78. Hackensack, NJ: World Scientific. https://doi.org/10.1142/6686
Giudice, Gian Francesco. 2017. “The Dawn of the Post-Naturalness Era”. arXiv preprint. https://doi.org/10.48550/arXiv.1710.07663
Hobson, Michael P., Efstathiou, George P., and Lasenby, Anthony N. 2006. General Relativity: An Introduction for Physicists. Cambridge: Cambridge University Press.
Hollands, Stefan and Wald, Robert M. 2023. “The Operator Product Expansion in Quantum Field Theory”. arXiv preprint. https://doi.org/10.48550/arXiv.2312.01096
Karaca, Koray. 2018. “Lessons from the Large Hadron Collider for Model-Based Experimentation: The Concept of a Model of Data Acquisition and the Scope of the Hierarchy of Models”. Synthese 195:5431–52. https://doi.org/10.1007/s11229-017-1453-5
Knox, Eleanor and Wallace, David. 2023. “Functionalism Fit for Physics”. Available at: https://philsci-archive.pitt.edu/22631/.
Koberinski, Adam. 2021a. “Problems with the Cosmological Constant Problem”. In Philosophy Beyond Spacetime, edited by Wüthrich, Christian, Le Bihan, Baptiste, and Huggett, Nick. Oxford: Oxford University Press.
Koberinski, Adam. 2021b. “Regularizing (Away) Vacuum Energy”. Foundations of Physics 51 (1):20. https://doi.org/10.1007/s10701-021-00442-z
Koberinski, Adam. 2022. “‘Fundamental’ ‘Constants’ and Precision Tests of the Standard Model”. Philosophy of Science 89 (5):1255–64. https://doi.org/10.1017/psa.2022.41
Koberinski, Adam. 2023. “Generalized Frameworks: Structuring Searches for New Physics”. European Journal for Philosophy of Science 13 (1):3. https://doi.org/10.1007/s13194-022-00504-7
Koberinski, Adam and Fraser, Doreen. 2023. “Renormalization Group Methods and the Epistemology of Effective Field Theories”. Studies in History and Philosophy of Science 98:14–28. https://doi.org/10.1016/j.shpsa.2023.01.003
Koberinski, Adam and Smeenk, Chris. 2020. “Q.E.D., QED”. Studies in History and Philosophy of Modern Physics 71:1–13. https://doi.org/10.1016/j.shpsb.2020.03.003
Koberinski, Adam and Smeenk, Chris. 2023. “Λ and the Limits of Effective Field Theory”. Philosophy of Science 90 (2):454–74. https://doi.org/10.1017/psa.2022.16
Manohar, Aneesh V. 2020. “Introduction to Effective Field Theories”. In Effective Field Theory in Particle Physics and Cosmology: Lecture Notes of the Les Houches Summer School, Volume 108, July 2017, 47. Oxford: Oxford University Press.
McKenzie, Kerry. 2019. “A Curse on Both Houses: Naturalistic Versus A Priori Metaphysics and the Problem of Progress”. Res Philosophica 97 (1):1–29. https://doi.org/10.11612/resphil.1868
Miller, Michael E. 2021. “Worldly Imprecision”. Philosophical Studies 178 (9):2895–911. https://doi.org/10.1007/s11098-020-01591-z
Peskin, Michael E. and Schroeder, Daniel V. 1995. An Introduction to Quantum Field Theory. Boca Raton, FL: CRC Press.
Rivat, Sébastien. 2021. “Effective Theories and Infinite Idealizations: A Challenge for Scientific Realism”. Synthese 198 (12):12107–36. https://doi.org/10.1007/s11229-020-02852-4
Rivat, Sébastien and Grinbaum, Alexei. 2020. “Philosophical Foundations of Effective Field Theories”. The European Physical Journal A 56:90. https://doi.org/10.1140/epja/s10050-020-00089-w
Rosaler, Joshua and Harlander, Robert. 2019. “Naturalness, Wilsonian Renormalization, and ‘Fundamental Parameters’ in Quantum Field Theory”. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 66:118–34. https://doi.org/10.1016/j.shpsb.2018.12.003
Ruetsche, Laura. 2018. “Renormalization Group Realism: The Ascent of Pessimism”. Philosophy of Science 85 (5):1176–89. https://doi.org/10.1086/699719
Ruetsche, Laura. 2020. “Perturbing Realism”. In Scientific Realism and the Quantum, edited by French, Steven and Saatsi, Juha. Oxford: Oxford University Press.
Rugh, Svend E. and Zinkernagel, Henrik. 2002. “The Quantum Vacuum and the Cosmological Constant Problem”. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 33 (4):663–705. https://doi.org/10.1016/S1355-2198(02)00033-3
Schneider, Mike D. 2020. “What’s the Problem with the Cosmological Constant?” Philosophy of Science 87 (1):1–20. https://doi.org/10.1086/706076
Smith, George E. 2010. “Revisiting Accepted Science: The Indispensability of the History of Science”. The Monist 93 (4):545–79. https://doi.org/10.5840/monist201093432
Smith, George E. 2014. “Closing the Loop”. In Newton and Empiricism, edited by Biener, Zvi and Schliesser, Eric, 262–352. New York: Oxford University Press.
Staley, Kent W. 2020. “Securing the Empirical Value of Measurement Results”. The British Journal for the Philosophy of Science 71 (1):87–113. https://doi.org/10.1093/bjps/axx036
Wallace, David. 2019. “Naturalness and Emergence”. The Monist 102 (4):499–524. https://doi.org/10.1093/monist/onz022
Wallace, David. 2021. “The Quantum Theory of Fields”. In The Routledge Companion to Philosophy of Physics, edited by Knox, Eleanor and Wilson, Alistair, 275–95. London: Routledge.
Wallace, David. 2022. “Quantum Gravity at Low Energies”. Studies in History and Philosophy of Science 94:31–46. https://doi.org/10.1016/j.shpsa.2022.04.003
Williams, Porter. 2015. “Naturalness, the Autonomy of Scales, and the 125 GeV Higgs”. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 51:82–96. https://doi.org/10.1016/j.shpsb.2015.05.003
Williams, Porter. 2019. “Scientific Realism Made Effective”. The British Journal for the Philosophy of Science 70 (1):209–37. https://doi.org/10.1093/bjps/axx043
Williams, Porter. 2021. “Renormalization Group Methods”. In The Routledge Companion to Philosophy of Physics, edited by Knox, Eleanor and Wilson, Alistair, 296–310. London: Routledge.