Tangles offer a precise way to identify structure in imprecise data. By grouping qualities that often occur together, they reveal not only clusters of things but also types of their qualities: types of political views, of texts, of health conditions, or of proteins. Tangles offer a new, structural approach to artificial intelligence that can help us understand, classify, and predict complex phenomena. This has become possible through the recent axiomatization of the mathematical theory of tangles, which has made it applicable far beyond its origin in graph theory: from clustering in data science and machine learning to predicting customer behaviour in economics; from DNA sequencing and drug development to text and image analysis. Such applications are explored here for the first time. Assuming only basic undergraduate mathematics, the theory of tangles and its potential implications are made accessible to scientists, computer scientists, and social scientists.
In creatures ranging from birds to fish to wildebeest, we observe the collective and coherent motion of large numbers of organisms, known as 'flocking.' John Toner, one of the founders of the field of active matter, uses the hydrodynamic theory of flocking to explain why a crowd of people can all walk, but not point, in the same direction. Assuming a basic undergraduate-level understanding of statistical mechanics, the text introduces readers to dry active matter and describes the current status of this rapidly developing field. Through the application of powerful techniques from theoretical condensed matter physics, such as hydrodynamic theories, the gradient expansion, and the renormalization group, readers are given the knowledge and tools to explore and understand this exciting field of research. This book will be valuable to graduate students and researchers in physics, mathematics, and biology with an interest in the hydrodynamic theory of flocking.
Interacting biological systems at all organizational levels display emergent behavior. Modeling these systems is made challenging by the number and variety of biological components and interactions – from molecules in gene regulatory networks to species in ecological networks – and the often-incomplete state of system knowledge, such as the unknown values of kinetic parameters for biochemical reactions. Boolean networks have emerged as a powerful tool for modeling these systems. This Element provides a methodological overview of Boolean network models of biological systems. After a brief introduction, the authors describe the process of building, analyzing, and validating a Boolean model. They then present the use of the model to make predictions about the system's response to perturbations and about how to control its behavior. The Element emphasizes the interplay between structural and dynamical properties of Boolean networks and illustrates them in three case studies from disparate levels of biological organization.
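As a minimal, purely hypothetical illustration of what a Boolean network model looks like (the three-node motif and its rules below are invented for this sketch, not taken from the Element), the following Python snippet updates the network synchronously and enumerates the attractors reached from every initial state:

from itertools import product

def update(state):
    # Hypothetical rules: A and B repress each other (a toggle switch);
    # C is active only when both A and B are on.
    a, b, c = state
    return (not b, not a, a and b)

attractors = set()
for initial in product([False, True], repeat=3):
    trajectory, state = [], initial
    while state not in trajectory:          # iterate until a state repeats
        trajectory.append(state)
        state = update(state)
    cycle = tuple(trajectory[trajectory.index(state):])      # the attractor (a cycle)
    # store a rotation-independent representative of the cycle
    attractors.add(min(cycle[i:] + cycle[:i] for i in range(len(cycle))))

for cycle in attractors:
    print([tuple(int(x) for x in s) for s in cycle])

This toy network has two fixed points (the two states of the toggle switch) and one two-state limit cycle; enumerating attractors in this way is the simplest form of the attractor analysis described above.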
Renormalization group theory of tensor network states provides a powerful tool for studying quantum many-body problems and a new paradigm for understanding entangled structures of complex systems. In recent decades the theory has rapidly evolved into a universal framework and language employed by researchers in fields ranging from condensed matter theory to machine learning. This book presents a pedagogical and comprehensive introduction to this field for the first time. After an introductory survey on the major advances in tensor network algorithms and their applications, it introduces step-by-step the tensor network representations of quantum states and the tensor-network renormalization group methods developed over the past three decades. Basic statistical and condensed matter physics models are used to demonstrate how the tensor network renormalization works. An accessible primer for scientists and engineers, this book would also be ideal as a reference text for a graduate course in this area.
This chapter extends DMRG from real space to an arbitrary basis space in which each basis state, such as a momentum eigenstate or a molecular orbital, serves as an effective lattice site. Unlike in real space, the interaction potentials become nonlocal and off-diagonal in an arbitrary basis representation. To solve this nonlocal problem, one should optimize the order of basis states and introduce the so-called complementary operators to minimize the number of operators whose matrix elements must be computed and stored. We illustrate the momentum-space DMRG using the Hubbard model and discuss its application to other interacting fermion models. Finally, we introduce a DMRG scheme for optimizing the single-particle basis states and their order simultaneously in a more general basis space without momentum conservation.
Implementing symmetries can significantly enhance the expressive power of DMRG and effectively allows us to retain more basis states in a DMRG calculation. This chapter discusses the techniques for imposing symmetries in a DMRG calculation, including spin reflection, spatial reflection, continuous U(1) and other symmetries with additive quantum numbers, and non-Abelian SU(2) symmetry.
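To illustrate what an additive quantum number buys, the following NumPy sketch (an illustration only, not the book's implementation) builds an 8-site spin-1/2 Heisenberg chain, groups the basis states by their conserved total S^z, verifies that the Hamiltonian has no matrix elements between different sectors, and recovers the full spectrum from the much smaller sector-by-sector diagonalizations:

import numpy as np

L = 8
Sz = np.diag([0.5, -0.5])
Sp = np.array([[0.0, 1.0], [0.0, 0.0]])
Sm = Sp.T

def op(o, i):
    # embed a single-site operator o at site i of the L-site chain
    return np.kron(np.kron(np.eye(2 ** i), o), np.eye(2 ** (L - i - 1)))

H = sum(op(Sz, i) @ op(Sz, i + 1)
        + 0.5 * (op(Sp, i) @ op(Sm, i + 1) + op(Sm, i) @ op(Sp, i + 1))
        for i in range(L - 1))

# total S^z of each basis state: an additive U(1) quantum number
sz_total = np.array([L / 2 - bin(s).count("1") for s in range(2 ** L)])

eigenvalues = []
for q in np.unique(sz_total):
    sector = np.where(sz_total == q)[0]
    rest = np.where(sz_total != q)[0]
    # H has no matrix elements connecting different S^z sectors ...
    assert np.allclose(H[np.ix_(sector, rest)], 0.0)
    # ... so each (much smaller) block can be diagonalized on its own
    eigenvalues.extend(np.linalg.eigvalsh(H[np.ix_(sector, sector)]))

print(np.allclose(np.sort(eigenvalues), np.linalg.eigvalsh(H)))   # True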
This chapter introduces the quantum transfer matrix renormalization group (QTMRG), a method for studying the thermodynamics and correlation functions of one-dimensional quantum lattice models. The RG transformation matrices are determined using the criteria presented in the preceding chapter and are used to update the transfer matrix and other physical quantities. The spin-1/2 and spin-1 antiferromagnetic Heisenberg models are used to demonstrate the accuracy and efficiency of the method.
This chapter introduces the density matrix renormalization group (DMRG) in real space. The infinite- and finite-lattice algorithms of DMRG are discussed, together with the approaches for targeting more than one eigenstate and for implementing DMRG in two dimensions by mapping a two-dimensional lattice onto a one-dimensional one. The one-dimensional antiferromagnetic Heisenberg model with both integer and half-integer spins is used to demonstrate the method.
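The following NumPy sketch of the infinite-lattice algorithm for the spin-1/2 antiferromagnetic Heisenberg chain shows the basic steps in miniature (an illustrative toy, not the book's code; the kept dimension m and the number of growth steps are arbitrary choices). The printed energy per site slowly approaches the exact Bethe-ansatz value 1/4 - ln 2, approximately -0.4431, as the chain grows:

import numpy as np

m = 16                                     # number of kept density-matrix eigenstates
Sz = np.diag([0.5, -0.5])
Sp = np.array([[0.0, 1.0], [0.0, 0.0]])    # S^+ ;  S^- = Sp.T
I2 = np.eye(2)

def bond(Sz1, Sp1, Sz2, Sp2):
    # Heisenberg coupling S_1 . S_2 between the edge sites of two blocks
    return (np.kron(Sz1, Sz2)
            + 0.5 * (np.kron(Sp1, Sp2.T) + np.kron(Sp1.T, Sp2)))

# initial block: a single site
Hb, Szb, Spb, dim = np.zeros((2, 2)), Sz, Sp, 2

for step in range(20):                     # each step grows the chain by two sites
    # enlarge the block by one site
    He = np.kron(Hb, I2) + bond(Szb, Spb, Sz, Sp)
    Sze, Spe = np.kron(np.eye(dim), Sz), np.kron(np.eye(dim), Sp)
    d = 2 * dim
    # superblock = enlarged block plus its mirror image, coupled at the centre
    Hsuper = (np.kron(He, np.eye(d)) + np.kron(np.eye(d), He)
              + bond(Sze, Spe, Sze, Spe))
    E, V = np.linalg.eigh(Hsuper)
    psi = V[:, 0].reshape(d, d)            # ground state as a (left block x right block) matrix
    rho = psi @ psi.T                      # reduced density matrix of the left half
    w, U = np.linalg.eigh(rho)
    T = U[:, np.argsort(w)[::-1][:m]]      # keep the m most probable states
    # rotate the block operators into the truncated basis
    Hb, Szb, Spb = T.T @ He @ T, T.T @ Sze @ T, T.T @ Spe @ T
    dim = T.shape[1]
    length = 4 + 2 * step                  # number of sites in the current superblock
    print(f"L = {length:3d}   E0/L = {E[0] / length:.6f}")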
This chapter introduces two kinds of RG methods for computing the leading eigenvalue and eigenvectors of a transfer matrix: TMRG (transfer matrix renormalization group) and CTMRG (corner transfer matrix renormalization group). These methods were developed to study the thermodynamic properties of two-dimensional classical statistical models. Furthermore, in the framework of MPS, the fixed-point equations of these methods are derived, and the steps for solving them efficiently are outlined.
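As a toy illustration of the central object of this chapter, the leading eigenvalue of a transfer matrix (not of TMRG or CTMRG themselves), the sketch below assembles the row-to-row transfer matrix of a narrow two-dimensional Ising strip with periodic boundary conditions and extracts its largest eigenvalue by power iteration; the free energy per site of the infinitely long strip then follows as f = -ln(lambda_max)/(beta * W):

import numpy as np

W, beta = 4, 0.4                            # strip width and inverse temperature
# enumerate the 2^W spin configurations of one row
spins = np.array([[1 - 2 * ((s >> i) & 1) for i in range(W)] for s in range(2 ** W)])
E_row = -np.sum(spins * np.roll(spins, -1, axis=1), axis=1)   # horizontal bonds (periodic)
E_col = -(spins @ spins.T)                                    # vertical bonds between rows
# symmetrized row-to-row transfer matrix
T = np.exp(-beta * (E_col + 0.5 * (E_row[:, None] + E_row[None, :])))

v = np.ones(2 ** W)
for _ in range(1000):                       # power iteration for the leading eigenvalue
    v = T @ v
    lam = np.linalg.norm(v)
    v /= lam

f = -np.log(lam) / (beta * W)               # free energy per site of the infinite strip
print(f"lambda_max = {lam:.6f}   f = {f:.6f}")
print(np.isclose(lam, np.linalg.eigvalsh(T).max()))   # cross-check: True

TMRG and CTMRG replace the brute-force power iteration used here by renormalization group truncations that keep the strip width, and hence the cost, under control as the system grows.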
This chapter reformulates QTMRG in the language of MPS and introduces the concept of bicanonical MPS and the method of biorthonormalization. The fixed-point equations for determining the local tensors of the MPS in a translation-invariant system with one or more sites per unit cell are derived. The steps for solving these equations in the scheme of biorthonormalization are discussed.
This chapter starts with an introductory survey of the physical background and historical developments that led to the emergence of the density matrix renormalization group (DMRG) and its tensor network generalization. We then briefly review the major progress in renormalization group methods for tensor networks and their applications over the past three decades. Tensor network renormalization was initially developed to solve quantum many-body problems, but its range of applications has grown steadily. It has now become an indispensable tool for investigating strongly correlated systems, statistical physics, quantum information, quantum chemistry, and artificial intelligence.
This chapter introduces the tensor network representation of physical operators, especially the matrix product representation of model Hamiltonians, called matrix product operators (MPO), and the quantum transfer matrix representation of partition functions with different boundary conditions or with an impurity. The leading eigenvalue and eigenvectors of the quantum transfer matrix determine all thermodynamic quantities. This allows us to investigate thermodynamics without solving the full energy spectrum of the Hamiltonian.
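To make the MPO construction concrete, the sketch below (an illustration in the spirit of this chapter, not the book's code) writes down the bond-dimension-5 MPO of the spin-1/2 Heisenberg chain, H = sum_i [S^z_i S^z_{i+1} + (S^+_i S^-_{i+1} + S^-_i S^+_{i+1})/2], contracts it over a four-site chain, and checks the result against the explicitly assembled Hamiltonian:

import numpy as np

Sz = np.diag([0.5, -0.5])
Sp = np.array([[0.0, 1.0], [0.0, 0.0]])
Sm = Sp.T
I2 = np.eye(2)

# local MPO tensor W[a, b] (left bond a, right bond b); each entry is a 2x2 operator
W = np.zeros((5, 5, 2, 2))
W[0, 0], W[1, 0], W[2, 0], W[3, 0] = I2, Sp, Sm, Sz
W[4, 1], W[4, 2], W[4, 3], W[4, 4] = 0.5 * Sm, 0.5 * Sp, Sz, I2

L = 4
# contract the MPO: start from the last row (left boundary), end on the first column
R = W[4]                                    # site 0, shape (5, 2, 2)
for _ in range(L - 1):
    D = R.shape[1]
    R = np.einsum('bij,bcmn->cimjn', R, W).reshape(5, 2 * D, 2 * D)
H_mpo = R[0]                                # right boundary selects the first column

# explicit construction as a sum of nearest-neighbour bond terms
h2 = np.kron(Sz, Sz) + 0.5 * (np.kron(Sp, Sm) + np.kron(Sm, Sp))
H_sum = sum(np.kron(np.eye(2 ** i), np.kron(h2, np.eye(2 ** (L - i - 2))))
            for i in range(L - 1))
print(np.allclose(H_mpo, H_sum))            # True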
This chapter introduces the tensor network ansatz for a quantum state whose entanglement entropy obeys the so-called area law with or without logarithmic corrections. This ansatz represents a quantum many-body wave function by a network product of local tensors defined on the lattice sites and treats all tensor elements as variational parameters. It includes, for example, one-dimensional matrix product states (MPS) and two-dimensional projected entangled pair states (PEPS) or projected entangled simplex states (PESS). A typical example is the spin-1 AKLT chain, whose ground state can be exactly represented as an MPS. If a logarithmic correction to the entanglement area law emerges, a tensor network state termed the multi-scale entanglement renormalization ansatz (MERA) describes the entanglement structure of the ground state more accurately in one dimension.
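For the AKLT example mentioned above, the exact MPS has bond dimension two. The short sketch below uses one common parametrization of the three local matrices (conventions differ by gauge and normalization) and diagonalizes the MPS transfer matrix; its spectrum {1, -1/3, -1/3, -1/3} shows that the state is normalizable and that spin correlations decay as (-1/3)^r, giving the correlation length 1/ln 3:

import numpy as np

sp = np.array([[0.0, 1.0], [0.0, 0.0]])     # sigma^+
sm = sp.T                                   # sigma^-
sz = np.diag([1.0, -1.0])                   # sigma^z

# local MPS matrices A^s for the physical states s = +1, 0, -1 of a spin-1 site
A = np.array([np.sqrt(2 / 3) * sp,
              -np.sqrt(1 / 3) * sz,
              -np.sqrt(2 / 3) * sm])

# transfer matrix E = sum_s A^s (x) conj(A^s) acting on the doubled bond space
E = np.einsum('sab,scd->acbd', A, A.conj()).reshape(4, 4)
print(np.round(np.sort(np.linalg.eigvals(E).real)[::-1], 6))
# -> [ 1. -0.333333 -0.333333 -0.333333 ]
#    the leading eigenvalue 1 normalizes the infinite MPS, and the ratio -1/3 of the
#    subleading eigenvalues gives the (-1/3)^r decay of the spin correlations.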
This chapter presents the methods for calculating dynamical correlation functions, including the continued-fraction expansion, Lanczos-DMRG, Lanczos-MPS, Chebyshev-MPS, correction vector, conjugate gradient, and dynamical DMRG methods. In practical applications of the Lanczos-MPS or Chebyshev-MPS method, a reorthogonalization scheme is introduced to optimize all the MPS generated by these methods. As an example application, we investigate the dynamical spectra of the spin-1/2 Heisenberg model using the Chebyshev-MPS method.
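As a toy illustration of the continued-fraction expansion (using exact diagonalization of a small ring rather than the Lanczos-MPS or Chebyshev-MPS machinery of this chapter), the sketch below computes the S^z spectral function of an 8-site spin-1/2 Heisenberg ring at momentum q = pi from Lanczos tridiagonal coefficients, with full reorthogonalization of the Lanczos vectors, and checks it against the Lehmann representation:

import numpy as np

L, eta = 8, 0.05
Sz = np.diag([0.5, -0.5])
Sp = np.array([[0.0, 1.0], [0.0, 0.0]])
Sm = Sp.T

def op(o, i):
    return np.kron(np.kron(np.eye(2 ** i), o), np.eye(2 ** (L - i - 1)))

H = sum(op(Sz, i) @ op(Sz, (i + 1) % L)
        + 0.5 * (op(Sp, i) @ op(Sm, (i + 1) % L) + op(Sm, i) @ op(Sp, (i + 1) % L))
        for i in range(L))
E, V = np.linalg.eigh(H)
E0, gs = E[0], V[:, 0]

# excitation operator: S^z at momentum q = pi, applied to the ground state
O = sum((-1.0) ** i * op(Sz, i) for i in range(L)) / np.sqrt(L)
phi = O @ gs
norm2 = phi @ phi

# Lanczos tridiagonalization of H starting from |phi>
a, b = [], []
v_prev, v, beta = np.zeros_like(phi), phi / np.sqrt(norm2), 0.0
basis = [v]
for _ in range(60):
    w = H @ v - beta * v_prev
    alpha = v @ w
    w = w - alpha * v
    for u in basis:                      # full reorthogonalization against all
        w = w - (u @ w) * u              # previous Lanczos vectors
    a.append(alpha)
    beta = np.linalg.norm(w)
    if beta < 1e-10:                     # Krylov space exhausted
        break
    b.append(beta)
    v_prev, v = v, w / beta
    basis.append(v)

# spectral function from the continued fraction
# G(z) = norm2 / (z - a0 - b1^2 / (z - a1 - b2^2 / (...))),  z = omega + E0 + i*eta
omegas = np.linspace(0.0, 4.0, 400)
A_cf = []
for omega in omegas:
    z, g = omega + E0 + 1j * eta, 0.0
    for n in range(len(a) - 1, -1, -1):
        bn2 = b[n] ** 2 if n < len(a) - 1 else 0.0
        g = 1.0 / (z - a[n] - bn2 * g)
    A_cf.append(-(norm2 * g).imag / np.pi)

# cross-check against the Lehmann representation from full diagonalization
weights = (V.T @ phi) ** 2
A_exact = [np.sum(weights * eta / np.pi / ((omega + E0 - E) ** 2 + eta ** 2))
           for omega in omegas]
print(np.allclose(A_cf, A_exact))        # True: both give the same broadened spectrum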
This chapter discusses the methods for solving PEPS and other two-dimensional tensor network states, including variational optimization and annealing simulation. Variational optimization determines the local tensors by minimizing the ground-state energy. The annealing simulation uses the full or simple update strategy to filter out the ground state through imaginary-time evolution. A nonlinear effect arises in evaluating the derivatives of a uniform PEPS; it can be avoided by utilizing automatic differentiation. Both variational optimization and the annealing simulation involve the contraction of double-layer tensor network states. This contraction is the primary technical barrier in the study of PEPS. A nested tensor network approach is introduced to overcome this difficulty.
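The filtering principle behind the annealing simulation, that repeated application of the imaginary-time evolution operator exp(-tau H) projects an arbitrary initial state onto the ground state, can already be seen for a small random Hamiltonian matrix (a toy illustration, unrelated to the PEPS contraction problem itself):

import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n, tau = 50, 0.2
H = rng.standard_normal((n, n))
H = (H + H.T) / 2                        # a random Hermitian "Hamiltonian"
P = expm(-tau * H)                       # imaginary-time projector exp(-tau H)

psi = rng.standard_normal(n)
psi /= np.linalg.norm(psi)
for _ in range(1000):                    # project and renormalize at every step
    psi = P @ psi
    psi /= np.linalg.norm(psi)

print(psi @ H @ psi)                     # energy of the projected state ...
print(np.linalg.eigvalsh(H)[0])          # ... matches the lowest eigenvalue of H

In PEPS simulations the projector is applied as a product of local imaginary-time gates, and the full or simple update controls how the bond dimension is truncated after each application.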