Statistical mechanics is hugely successful when applied to physical systems at thermodynamic equilibrium; however, most natural phenomena occur in nonequilibrium conditions and more sophisticated techniques are required to address this increased complexity. This second edition presents a comprehensive overview of nonequilibrium statistical physics, covering essential topics such as Langevin equations, Lévy processes, fluctuation relations, transport theory, directed percolation, kinetic roughening, and pattern formation. The first part of the book introduces the underlying theory of nonequilibrium physics, the second part develops key aspects of nonequilibrium phase transitions, and the final part covers modern applications. A pedagogical approach has been adopted for the benefit of graduate students and instructors, with clear language and detailed figures used to explain the relevant models and experimental results. With the inclusion of original material and organizational changes throughout the book, this updated edition will be an essential guide for graduate students and researchers in nonequilibrium thermodynamics.
Statistical mechanics employs the power of probability theory to shine a light upon the invisible world of matter's fundamental constituents, allowing us to accurately model the macroscopic physical properties of large ensembles of microscopic particles. This book delves into the conceptual and mathematical foundations of statistical mechanics to enhance understanding of complex physical systems and thermodynamic phenomena, whilst also providing a solid mathematical basis for further study and research in this important field. Readers will embark on a journey through important historical experiments in statistical physics and thermodynamics, exploring their intersection with modern applications, such as the thermodynamics of stars and the entropy associated with the mixing of two substances. An invaluable resource for students and researchers in physics and mathematics, this text provides numerous worked examples and exercises with full solutions, reinforcing key theoretical concepts and offering readers deeper insight into how these powerful tools are applied.
Stochastic thermodynamics has emerged as a comprehensive theoretical framework for a large class of non-equilibrium systems including molecular motors, biochemical reaction networks, colloidal particles in time-dependent laser traps, and bio-polymers under external forces. This book introduces the topic in a systematic way, beginning with a dynamical perspective on equilibrium statistical physics. Key concepts like the identification of work, heat and entropy production along individual stochastic trajectories are then developed and shown to obey various fluctuation relations beyond the well-established linear response regime. Representative applications are then discussed, including simple models of molecular motors, small chemical reaction networks, active particles, stochastic heat engines and information machines involving Maxwell demons. This book is ideal for graduate students and researchers of physics, biophysics, and physical chemistry, with an interest in non-equilibrium phenomena.
This extensive revision of the 2007 book 'Random Graph Dynamics,' covering the current state of mathematical research in the field, is ideal for researchers and graduate students. It considers a small number of types of graphs, primarily the configuration model and inhomogeneous random graphs. However, it investigates a wide variety of dynamics. The author describes results for the convergence to equilibrium for random walks on random graphs as well as topics that have emerged as mature research areas since the publication of the first edition, such as epidemics, the contact process, voter models, and coalescing random walks. Chapter 8 discusses a new, challenging, and largely uncharted direction: systems in which the graph and the states of its vertices coevolve.
Drawing examples from real-world networks, this essential book traces the methods behind network analysis and explains how network data is first gathered, then processed and interpreted. The text will equip you with a toolbox of diverse methods and data modelling approaches, allowing you to quickly start making your own calculations on a huge variety of networked systems. This book sets you up to succeed, addressing the questions of what you need to know and what to do with it, when beginning to work with network data. The hands-on approach adopted throughout means that beginners quickly become capable practitioners, guided by a wealth of interesting examples that demonstrate key concepts. Exercises using real-world data extend and deepen your understanding, and develop effective working patterns in network calculations and analysis. Suitable for both graduate students and researchers across a range of disciplines, this novel text provides a fast-track to network data expertise.
Tangles offer a precise way to identify structure in imprecise data. By grouping qualities that often occur together, they not only reveal clusters of things but also types of their qualities: types of political views, of texts, of health conditions, or of proteins. Tangles offer a new, structural, approach to artificial intelligence that can help us understand, classify, and predict complex phenomena. This has become possible by the recent axiomatization of the mathematical theory of tangles, which has made it applicable far beyond its origin in graph theory: from clustering in data science and machine learning to predicting customer behaviour in economics; from DNA sequencing and drug development to text and image analysis. Such applications are explored here for the first time. Assuming only basic undergraduate mathematics, the theory of tangles and its potential implications are made accessible to scientists, computer scientists, and social scientists.
In creatures ranging from birds to fish to wildebeest, we observe the collective and coherent motion of large numbers of organisms, known as 'flocking.' John Toner, one of the founders of the field of active matter, uses the hydrodynamic theory of flocking to explain why a crowd of people can all walk, but not point, in the same direction. Assuming a basic undergraduate-level understanding of statistical mechanics, the text introduces readers to dry active matter and describes the current status of this rapidly developing field. Through the application of powerful techniques from theoretical condensed matter physics, such as hydrodynamic theories, the gradient expansion, and the renormalization group, readers are given the knowledge and tools to explore and understand this exciting field of research. This book will be valuable to graduate students and researchers in physics, mathematics, and biology with an interest in the hydrodynamic theory of flocking.
Renormalization group theory of tensor network states provides a powerful tool for studying quantum many-body problems and a new paradigm for understanding entangled structures of complex systems. In recent decades the theory has rapidly evolved into a universal framework and language employed by researchers in fields ranging from condensed matter theory to machine learning. This book presents a pedagogical and comprehensive introduction to this field for the first time. After an introductory survey on the major advances in tensor network algorithms and their applications, it introduces step-by-step the tensor network representations of quantum states and the tensor-network renormalization group methods developed over the past three decades. Basic statistical and condensed matter physics models are used to demonstrate how the tensor network renormalization works. An accessible primer for scientists and engineers, this book would also be ideal as a reference text for a graduate course in this area.
Many real-life systems are dynamic, evolving, and intertwined. Examples of such systems displaying 'complexity' can be found in a wide variety of contexts, ranging from economics to biology to the environmental and physical sciences. The study of complex systems involves analysis and interpretation of vast quantities of data, which necessitates the application of many classical and modern tools and techniques from statistics, network science, machine learning, and agent-based modelling. Drawing from the latest research, this self-contained and pedagogical text describes some of the most important and widely used methods, emphasising both empirical and theoretical approaches. More broadly, this book provides an accessible guide to a data-driven toolkit for scientists, engineers, and social scientists who require effective analysis of large quantities of data, whether that be related to social networks, financial markets, economies, or other types of complex systems.
The Cambridge Manual to Archaeological Network Science provides the first comprehensive guide to a field of research that has firmly established itself within archaeological practice in recent years. Network science methods are commonly used to explore big archaeological datasets and are essential for the formal study of past relational phenomena: social networks, transport systems, communication, and exchange. The volume offers a step-by-step description of network science methods and explores their theoretical foundations and applications in archaeological research, richly illustrated with archaeological examples. It also covers a vast range of network science techniques that can enhance archaeological research, including network data collection and management, exploratory network analysis, sampling issues and sensitivity analysis, spatial networks, and network visualisation. An essential reference handbook for both beginning and experienced archaeological network researchers, the volume includes boxes with definitions, boxed examples, exercises, and online supplementary learning and teaching materials.
This text on the theory and applications of network science is aimed at beginning graduate students in statistics, data science, computer science, machine learning, and mathematics, as well as advanced students in business, computational biology, physics, social science, and engineering working with large, complex relational data sets. It provides an exciting array of analysis tools, including probability models, graph theory, and computational algorithms, exposing students to ways of thinking about types of data that are different from typical statistical data. Concepts are demonstrated in the context of real applications, such as relationships between financial institutions, between genes or proteins, between neurons in the brain, and between terrorist groups. Methods and models described in detail include random graph models, percolation processes, methods for sampling from huge networks, network partitioning, and community detection. In addition to static networks, the book introduces dynamic networks such as epidemics, where time is an important component.
The epilepsies are devastating neurological disorders for which progress developing effective new therapies has slowed over recent decades, primarily due to the complexity of the brain at all scales. This reality has shifted the focus of experimental and clinical practice toward complex systems approaches to overcoming current barriers. Organized by scale from genes to whole brain, the chapters of this book survey the theoretical underpinnings and use of network and dynamical systems approaches to interpreting and modeling experimental and clinical data in epilepsy. The emphasis throughout is on the value of the non-trivial, and often counterintuitive, properties of complex systems, and how to leverage these properties to elaborate mechanisms of epilepsy and develop new therapies. In this essential book, readers will learn key concepts of complex systems theory applied across multiple scales and how each of these scales connects to epilepsy.
Ecosystems, the human brain, ant colonies, and economic networks are all complex systems displaying collective behaviour, or emergence, beyond the sum of their parts. Complexity science is the systematic investigation of these emergent phenomena, and stretches across disciplines, from physics and mathematics to biological and social sciences. This introductory textbook provides detailed coverage of this rapidly growing field, accommodating readers from a variety of backgrounds and with varying levels of mathematical skill. Part I presents the underlying principles of complexity science, to ensure students have a solid understanding of the conceptual framework. The second part introduces the key mathematical tools central to complexity science, gradually developing the mathematical formalism, with more advanced material provided in boxes. A broad range of end-of-chapter problems and extended projects offer opportunities for homework assignments and student research projects, with solutions available to instructors online. Key terms are highlighted in bold and listed in a glossary for easy reference, while annotated reading lists offer the option for extended reading and research.
This book provides a comprehensive and self-contained overview of recent progress in nonequilibrium statistical mechanics, in particular, the discovery of fluctuation relations and other time-reversal symmetry relations. The significance of these advances is that nonequilibrium statistical physics is no longer restricted to the linear regimes close to equilibrium, but extends to fully nonlinear regimes. These important new results have inspired the development of a unifying framework for describing both the microscopic dynamics of collections of particles, and the macroscopic hydrodynamics and thermodynamics of matter itself. The book discusses the significance of this theoretical framework in relation to a broad range of nonequilibrium processes, from the nanoscale to the macroscale, and is essential reading for researchers and graduate students in statistical physics, theoretical chemistry and biological physics.
This textbook establishes a theoretical framework for understanding deep learning models of practical relevance. With an approach that borrows from theoretical physics, Roberts and Yaida provide clear and pedagogical explanations of how realistic deep neural networks actually work. To make results from the theoretical forefront accessible, the authors eschew the subject's traditional emphasis on intimidating formality without sacrificing accuracy. Straightforward and approachable, this volume balances detailed first-principle derivations of novel results with insight and intuition for theorists and practitioners alike. This self-contained textbook is ideal for students and researchers interested in artificial intelligence with minimal prerequisites of linear algebra, calculus, and informal probability theory, and it can easily fill a semester-long course on deep learning theory. For the first time, the exciting practical advances in modern artificial intelligence capabilities can be matched with a set of effective principles, providing a timeless blueprint for theoretical research in deep learning.
Since the early eighteenth century, the theory of networks and graphs has matured into an indispensable tool for describing countless real-world phenomena. However, the study of large-scale features of a network often requires unrealistic limits, such as taking the network size to infinity or assuming a continuum. These asymptotic and analytic approaches can significantly diverge from real or simulated networks when applied at the finite scales of real-world applications. This book offers an approach to overcoming these limitations by introducing operator graph theory, an exact, non-asymptotic set of tools combining graph theory with operator calculus. The book is intended for mathematicians, physicists, and other scientists interested in discrete finite systems and their graph-theoretical description, and in delineating the abstract algebraic structures that characterise such systems. All the necessary background on graph theory and operator calculus is included for readers to understand the potential applications of operator graph theory.
Data assimilation is a hugely important mathematical technique, relevant in fields as diverse as geophysics, data science, and neuroscience. This modern book provides an authoritative treatment of the field as it relates to several scientific disciplines, with a particular emphasis on recent developments from machine learning and its role in the optimisation of data assimilation. Underlying theory from statistical physics, such as path integrals and Monte Carlo methods, is developed in the text as a basis for data assimilation, and the author then explores examples from current multidisciplinary research, such as the modelling of shallow water systems, ocean dynamics, and neuronal dynamics in the avian brain. The theory of data assimilation and machine learning is introduced in an accessible and unified manner, and the book is suitable for undergraduate and graduate students from science and engineering without specialized experience of statistical physics.
Statistical physics examines the collective properties of large ensembles of particles, and is a powerful theoretical tool with important applications across many different scientific disciplines. This book provides a detailed introduction to classical and quantum statistical physics, including links to topics at the frontiers of current research. The first part of the book introduces classical ensembles, provides an extensive review of quantum mechanics, and explains how their combination leads directly to the theory of Bose and Fermi gases. This allows a detailed analysis of the quantum properties of matter, and introduces the exotic features of vacuum fluctuations. The second part discusses more advanced topics such as the two-dimensional Ising model and quantum spin chains. This modern text is ideal for advanced undergraduate and graduate students interested in the role of statistical physics in current research. 140 homework problems reinforce key concepts and further develop readers' understanding of the subject.
Dealing with all aspects of Monte Carlo simulation of complex physical systems encountered in condensed matter physics and statistical mechanics, this book provides an introduction to computer simulations in physics. The 5th edition contains extensive new material describing numerous powerful algorithms and methods that represent recent developments in the field. New topics such as active matter and machine learning are also introduced. Throughout, there are many applications, examples, recipes, case studies, and exercises to help the reader fully comprehend the material. This book is ideal for graduate students and researchers, both in academia and industry, who want to learn techniques that have become a third tool of physical science, complementing experiment and analytical theory.
This modern and self-contained book offers a clear and accessible introduction to the important topic of machine learning with neural networks. In addition to describing the mathematical principles of the topic, and its historical evolution, strong connections are drawn with underlying methods from statistical physics and current applications within science and engineering. Closely based around a well-established undergraduate course, this pedagogical text provides a solid understanding of the key aspects of modern machine learning with artificial neural networks, for students in physics, mathematics, and engineering. Numerous exercises expand and reinforce key concepts within the book and allow students to hone their programming skills. Frequent references to current research develop a detailed perspective on the state-of-the-art in machine learning research.