Some ten years ago, when completing with J.-B. Zuber a previous text on Quantum Field Theory, the senior author was painfully aware that little mention was made that methods in statistical physics and Euclidean field theory were coming closer and closer, with common tools based on the use of path integrals and the renormalization group giving insights on global structures. It was partly to fill this gap that the present book was undertaken. Alas, over the five years that it took to come to life, both subjects have undergone a new evolution. Disordered media, growth patterns, complex dynamical systems or spin glasses are among the new important topics in statistical mechanics, while superstring theory has turned to the study of extended systems, Kaluza–Klein theories in higher dimensions, anticommuting coordinates … in an attempt to formulate a unified model including all known interactions. New and sophisticated techniques have invaded statistical physics, ranging from algebraic methods in integrable systems to fractal sets or random surfaces. Powerful computers or special devices provide “experimental” means for a new brand of theoretical physicists. In quantum field theory, applications of differential topology, geometry, Riemannian manifolds, operator theory … require a deeper background in mathematics and a knowledge of some of its most recent developments. As a result, when surveying what has been included in the present volume in an attempt to uncover the basic unity of these subjects, the authors have the same unsatisfactory feeling of not being able to bring the reader really up to date.
A new field opened when modern computers offered the possibility of performing extensive simulations of large systems. This allows known behaviours to be checked and provides an exploratory guide in circumstances where theoretical tools are absent. Measurements of observables can be compared both to existing theoretical expectations, providing a crucial test of their validity, and to experiments, checking the modelling of a physical system. This chapter presents the background material needed to design simulations on a (relatively) large scale. Some numerical examples have already been presented in previous chapters, and we only give a few further illustrations, pertaining mainly to lattice gauge theory, whose study relies extensively on this method. We also describe a practical implementation of real space renormalization, known as the Monte Carlo Renormalization Group method (Ma, Swendsen, Wilson). Finally, we discuss specific issues relevant to the extension of the simulations to fermionic systems.
Algorithms
Generalities
Systems with up to 10⁶ to 10⁷ variables can be handled by computers, and these numbers may soon be significantly increased. The measurements can be sufficiently numerous to reach good statistical accuracy. Although simulated systems still have a modest size as compared to macroscopic ones, collective effects already appear clearly, and accurate results about critical phenomena emerge from the numerical simulations. It turns out, when investigating more closely the available numerical methods, that one gets a better insight into the foundations of equilibrium statistical physics, the ergodicity problems, the meaning of probabilities and, last but not least, the ways in which equilibrium is approached.
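As an illustration of the kind of simulation discussed here, the following is a minimal sketch (ours, not taken from the text) of the Metropolis algorithm applied to the two-dimensional Ising model; the lattice size, coupling and run length are arbitrary choices made for the example.

```python
import math
import random

def metropolis_ising(L=16, beta=0.5, sweeps=200, seed=0):
    """Metropolis sampling of the 2D Ising model on an L x L periodic
    lattice (energy E = -sum of products of neighbouring spins).

    Returns the mean absolute magnetisation per spin, averaged over the
    second half of the run (the first half is discarded as equilibration).
    """
    rng = random.Random(seed)
    spin = [[1] * L for _ in range(L)]          # fully ordered start
    mags = []
    for sweep in range(sweeps):
        for _ in range(L * L):                  # one sweep = L*L attempts
            i, j = rng.randrange(L), rng.randrange(L)
            # sum of the four neighbouring spins (periodic boundaries)
            nb = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j]
                  + spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
            dE = 2.0 * spin[i][j] * nb          # energy cost of a flip
            # accept with probability min(1, exp(-beta * dE))
            if dE <= 0 or rng.random() < math.exp(-beta * dE):
                spin[i][j] = -spin[i][j]
        if sweep >= sweeps // 2:
            m = sum(map(sum, spin)) / (L * L)
            mags.append(abs(m))
    return sum(mags) / len(mags)
```

At β = 0.5, above the critical coupling β_c ≈ 0.44 of the square lattice, the measured magnetisation should come out close to the exact spontaneous value m₀ = (1 − sinh⁻⁴ 2β)^(1/8) ≈ 0.91, while it drops towards zero for small β.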
This chapter is devoted to technicalities related to various expansions already encountered in volume 1, mostly those that derive from the original lattice formulation of the models, be they high temperature, low temperature or strong coupling expansions, and to some extent those arising in the guise of Feynman diagrams in the continuous framework. We shall not try to be exhaustive, but rather illustrative, relying on the reader's interest to investigate in greater depth some aspects inadequately treated. Nor shall we try to explore with great sophistication the vast domain of graph theory. There are, however, a number of common features, mostly of topological nature, which we would like to present as examples of the diversity of what looks at first sight like straightforward procedures.
General Techniques
Definitions and notations
Let a labelled graph G be a collection of v elements from a set of indices, and l pairs of such elements with possible duplications (i.e. multiple links). We shall also interchangeably use the word diagram instead of graph. This abstract object is represented by v points (vertices) and l links. Each vertex is labelled by its index.
The problem under consideration will define a set of admissible graphs, with a corresponding weight ω(G) (a real or complex number) according to a well-defined set of rules. We wish to find the sum of weights over all admissible graphs.
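To make these definitions concrete, the following sketch (our illustration, not the text's) represents a labelled graph as a vertex set together with a multiset of links, and sums weights over a family of admissible graphs. The weight rule ω(G) = x^l, one factor x per link counted with multiplicity, is a hypothetical choice reminiscent of a high temperature expansion.

```python
from collections import Counter
from itertools import combinations

def weight(graph, x):
    """Hypothetical weight rule: omega(G) = x**l, with l the number of
    links counted with multiplicity."""
    vertices, links = graph
    return x ** sum(links.values())

def partition_sum(graphs, x):
    """Sum of weights over a given set of admissible graphs."""
    return sum(weight(g, x) for g in graphs)

# Example: the admissible graphs are all simple graphs on 3 labelled
# vertices (no multiple links); links are stored as a Counter so that
# multiple links could be accommodated as well.
vertices = (1, 2, 3)
all_links = list(combinations(vertices, 2))     # the 3 possible links
graphs = []
for r in range(len(all_links) + 1):
    for subset in combinations(all_links, r):
        graphs.append((vertices, Counter(subset)))

# With omega(G) = x**l the sum factorises link by link as (1 + x)**3.
print(partition_sum(graphs, 0.5))   # → 3.375
```

The factorisation over links fails as soon as the admissibility rules or the weights couple different links, which is the generic situation treated in this chapter.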
Up to now, continuous field theory has appeared as a tool in the study of critical phenomena. Conversely, techniques from statistical mechanics can be useful in field theory. In 1973, Wilson proposed a lattice analog of the Yang–Mills gauge model. Its major aim was to explain the confinement of quarks in quantum chromodynamics. The lattice implementation of a local symmetry yields a transparent geometric interpretation of the gauge potential degrees of freedom, the latter being replaced by group elements assigned to links. Strong coupling expansions predict a linearly rising potential energy between static sources. Complex phase diagrams emerge when gauge fields are coupled to matter fields, and new phenomena appear, such as the absence of local order parameters. The discretization of fermions also leads to interesting relations with topology. This chapter is devoted to the theoretical developments of these ideas.
Generalities
Presentation
Schematic as they are, statistical models have a direct physical background at any temperature. Lattices may represent the crystalline structure of solids. They play an important role at short distance, but become irrelevant in the critical region, except as a regulator for the field theory describing the approach to critical points. The opposite point of view can also be considered. A lattice is artificially introduced as a regulator for a continuous field theory. The lattice system has no physical meaning, but can be studied at any “temperature”, so that one can get information about its critical region, hopefully described by the initial continuous theory.
It may seem surprising to start our study with a description of Brownian motion. However, this offers an interesting introduction to the concept of Euclidean quantum field, and an intuitive understanding of the role of dimensionality. The effective (or Hausdorff) dimension two of Brownian curves is particularly significant. It means that two such curves fail to intersect, hence to interact, in dimension higher than four. This is illustrated in the first section of this chapter, which also discusses the transition from a discrete to a continuous walk. A similar analysis for interacting fields, pioneered by K. Symanzik, is presented in the second section. It is related to strong coupling, or high temperature, expansions, to be studied later, in particular in chapter 6 of this volume and chapter 7 of volume 2. The introduction of n-component fields provides the means to incorporate “self-avoiding” walks in the limit n → 0. We conclude this chapter with an analysis of elementary one-dimensional systems. This enables us to introduce the useful concept of transfer matrix.
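As a foretaste of the transfer matrix concept, the sketch below (an illustration of ours, not the book's) computes the partition function of the closed one-dimensional Ising chain in two ways: by brute-force enumeration over configurations, and as the trace of the N-th power of the 2×2 transfer matrix, using its eigenvalues 2 cosh βJ and 2 sinh βJ.

```python
import math
from itertools import product

def ising_1d_transfer(beta_J, n_sites):
    """Partition function of the closed 1D Ising chain via the 2x2
    transfer matrix T(s, s') = exp(beta*J*s*s').  Then
    Z = Tr T^N = lambda_+**N + lambda_-**N, with eigenvalues
    lambda_+ = 2 cosh(beta*J) and lambda_- = 2 sinh(beta*J)."""
    lam_plus = 2.0 * math.cosh(beta_J)
    lam_minus = 2.0 * math.sinh(beta_J)
    return lam_plus ** n_sites + lam_minus ** n_sites

def ising_1d_brute(beta_J, n_sites):
    """Same partition function by direct summation over the 2**N spin
    configurations, with periodic boundary conditions."""
    Z = 0.0
    for spins in product((-1, 1), repeat=n_sites):
        energy = sum(spins[i] * spins[(i + 1) % n_sites]
                     for i in range(n_sites))
        Z += math.exp(beta_J * energy)
    return Z
```

The brute-force sum grows as 2^N, while the transfer matrix expression is evaluated in constant time; this reduction of a one-dimensional sum to a matrix power is the essence of the method.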
Brownian motion
Random walks
We begin with a discussion of random walks on a regular, infinite lattice in d-dimensional Euclidean space. Each site has q neighbours, where q is called the coordination number of the lattice. At regular time intervals, separated by an amount Δt = 1, a walker jumps from one site towards a neighbouring one, chosen at random.
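For the hypercubic lattice Z^d, where q = 2d, such a walk is straightforward to simulate. The sketch below (our illustration) estimates the mean square displacement after t steps, which grows linearly, ⟨R²⟩ = t, in any dimension; this linear growth is the signature of the effective dimension two of Brownian curves.

```python
import random

def mean_square_displacement(d=3, t=200, walkers=2000, seed=1):
    """Random walks on the hypercubic lattice Z^d (coordination number
    q = 2d): at each time step the walker jumps to one of its 2d
    neighbours, chosen with equal probability.

    Returns the estimate of <R^2> after t steps, averaged over
    independent walkers; the exact value is t for unit steps.
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(walkers):
        pos = [0] * d
        for _ in range(t):
            axis = rng.randrange(d)         # pick a lattice direction
            pos[axis] += rng.choice((-1, 1))  # jump forward or backward
        total += sum(x * x for x in pos)
    return total / walkers
```

Since successive steps are uncorrelated and of unit length, the cross terms in ⟨R²⟩ = ⟨(Σ eᵢ)²⟩ average to zero and only the t diagonal terms survive, independently of d.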
Condensed media show a large variety of critical phenomena, ranging from critical opalescence at the end point of the liquid–gas coexistence curve, the Curie transition of ferromagnetic materials, to the superfluid transition of helium, the behaviour of solutions of polymers, the conductivity of random media, … To this list should be added systems with local symmetries, such as those suggested by particle physics in order to understand quark confinement. Surprisingly, all these phenomena can be classified into a few universality classes, characterized by specific large distance behaviour with the same critical exponents. In this chapter, we sketch the methods and ideas of the renormalization procedure. We illustrate the concepts with simple approximations in the language of classical spin models in the first section, treating in more detail the XY-model as an example in the second section.
Scaling laws. Real space renormalization
Homogeneity and scale invariance
The discussion given in previous chapters suggests that, close to a continuous transition, critical systems exhibit universal properties. Correlations at large distance are not sensitive to the details of microscopic interactions. Their behaviour is described by a specific dimensional analysis governed by some essential characteristics of the system, such as the dimension of space, the nature of the order parameter and the underlying symmetries.
The mean field approximation gave a first idea of a simple critical behaviour. Fluctuations have only a quantitative effect in dimension greater than the upper critical one.
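In the simplest Weiss version for the Ising case, the mean field approximation reduces to the self-consistency condition m = tanh(βqJm). The sketch below (our illustration, writing K = βqJ) solves this equation by fixed-point iteration, exhibiting the onset of a nonzero magnetisation at the mean field critical point K = 1.

```python
import math

def mean_field_magnetisation(K, tol=1e-12, max_iter=10000):
    """Solve the Weiss self-consistency equation m = tanh(K * m),
    with K = beta * q * J, by fixed-point iteration from the fully
    ordered value m = 1.

    For K <= 1 the iteration collapses to m = 0; for K > 1 it
    converges to the nonzero spontaneous magnetisation."""
    m = 1.0
    for _ in range(max_iter):
        m_new = math.tanh(K * m)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m_new
```

For instance, K = 2 yields m ≈ 0.96, while any K ≤ 1 returns m = 0 (up to the tolerance); expanding tanh near m = 0 shows that the nontrivial solution appears precisely when K exceeds 1.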
Continuous field models were originally systematically elaborated (and still are) in the context of particle physics. Most relevant to the study of critical phenomena is the derivation of a renormalization group flow, characterized by a set of Callan–Symanzik equations for Green functions, with regular coefficients derived from perturbation theory. This is supplemented by the idea, due to Fisher and Wilson, of an expansion in powers of the deviation from the strict renormalizability dimension (four in the case of the ϕ⁴ model), which can also be presumed to work directly in the physical dimension three (and possibly even two). We devote this chapter to a general presentation and a survey of some applications. An appendix gives a short introduction to multicritical phenomena.
The Lagrangian and dimensional analysis
Introduction
We want to investigate universal critical properties. A discrete lattice model appears as a regularizing intermediate stage, which allows a precise meaning to be given to the functional integrals of the field theory. Following the scheme suggested by the mean field approximation, one is tempted to start directly from a continuous model, in which the lattice is replaced by a d-dimensional continuous Euclidean space, and also, even for models with a discrete symmetry, the dynamical variables are replaced by continuous fields. From the original formulation, we just retain the idea of a cutoff Λ large with respect to all momenta, which ensures that some possibly divergent expressions remain finite.
Statistical models with a geometrical basis arise in many circumstances, such as the theory of liquids, membranes, polymer networks, defects, microemulsions, interfaces, etc. Gauge theories also lead to random surfaces, as does the theory of extended objects such as strings, and quantum gravity requires a generalization to four-manifolds. From a general point of view, local quantum field theory is rooted in the study of random paths. One may wish to find such a universal model, generalizing Brownian curves, to Brownian manifolds and in the first instance to surfaces (Polyakov, 1981). Despite many efforts, no such universal archetype has been found, although the endeavour towards such a model has uncovered a rich mathematical structure. By necessity, our presentation will be limited to the most elementary aspects.
In the first section we discuss random lattices in Euclidean space. The use of such lattices was advocated as a means to restore translational invariance while keeping the advantage of a short distance cutoff (Christ, Friedberg and Lee, 1982). The formalism could be generalized to other manifolds, but we refrain from doing so; nor will we pursue the analysis of standard models on random lattices, a difficult subject. Even free field theory on such a lattice opens the Pandora's box of disordered systems. Concepts from random lattices may be useful in the study of liquids or gases.
Real systems may involve various types of defects. It is therefore important to assess their relevance and to estimate their effect on the results established in pure cases. The emphasis here is on stability and robustness versus random perturbations. We present in the last section of this chapter the Harris criterion for estimating the effect of weak disorder on a critical system. At a more fundamental level, one can also look for new phenomena originating in the very existence of defects. For instance, the dynamics of dislocations in a crystal can explain the solid-liquid transition. Similarly, we were led to analyze the role of vortices when studying the XY-model. Random impurity potentials produce the localization of quantum wavefunctions (Anderson, 1958), which enables one to understand the transition between insulators and conductors. This same localization phenomenon appears in other contexts involving classical waves, such as optics or sound propagation. This subject has given rise to an intense activity. One cannot claim that it is fully understood at present, even though the weakly disordered case seems under control, thanks to renormalization group arguments. The role of magnetic fields has opened a new area of research centered on the quantum Hall effect. Another domain which has stimulated vigorous and original developments is that of magnetic systems with random interactions, where magnetic moments can become frozen in random directions, with a plethora of metastable states, or valleys, in free energy.
It was noted by Polyakov and others in the early seventies that critical models implement a global conformal invariance which goes beyond pure scale invariance. The latter affects relative distances by a constant (i.e. space independent) factor, while other conformal transformations involve a space dependent factor. This invariance property enables one to fix not only the form of two-point but also three-point functions at criticality, when they are nonvanishing. However, the conformal group is in general a finite dimensional Lie group, so that the resulting constraints are limited in number. In two dimensions, a new phenomenon arises, well known in the theory of analytic functions, namely there exists a plethora of local conformal transformations. As a result, it was tempting to investigate the possible consequences of local scale invariance in two dimensions. This is what was brilliantly undertaken by Belavin, Polyakov and Zamolodchikov in 1983, launching a new wave of applications in statistical physics. As the subject is still developing, the present chapter will not be as elementary as previous ones, nor will it presumably remain up to date, especially as it is closely related to string field theory, a promising new approach to the quantum description of extended objects, which attempts to embrace all known interactions including gauge theories and gravity.
In many statistical models (but not all) one can introduce a local order parameter associated with a finite, discrete or continuous group of symmetries. The higher the temperature, the more important the fluctuations. One expects therefore in general to find pure phases at low temperature, with a reduced symmetry group, as a consequence of a nonvanishing expectation value of some order parameter. This situation is referred to as spontaneous symmetry breaking. A typical example is the classical Heisenberg model describing short-range interactions of an n-vector field ϕ, with an orthogonal O(n) symmetry group. For n = 1, 2, 3, this can account for a Curie transition from a ferromagnetic to a paramagnetic phase. In particle physics the σ-model of Gell-Mann and Lévy involves a spontaneous symmetry breaking of chiral invariance, typical of massless spinor fields, accompanied by soft excitation modes, the so-called Goldstone modes, associated with a π-meson triplet and leading to a nonvanishing dynamical fermion mass. In a first and rather crude approximation, one can analyse the action itself or an effective one incorporating some fluctuation effects, and look for extrema as a function of field configurations, generally translationally invariant. The remaining fluctuations are then treated perturbatively. This mean field method is common to a great variety of domains, ranging from the Clausius–Mossotti formula for a polarizable medium, the Weiss molecular field in the theory of magnetism, Landau's effective action in various statistical contexts, the effective medium approximation in disordered systems, the Hartree–Fock method in atomic or many-body physics, to the semiclassical approximation in the study of quantum systems.