Book contents
- Frontmatter
- Contents
- Preface
- Unit Used
- Notations and Graphical Representations
- Abbreviations
- 1 Introduction
- 2 Basic Algebra of Tensors
- 3 Tensor Network Representation of Classical Statistical Models
- 4 Tensor Network Representation of Operators
- 5 Tensor Network Ansatz of Wave Functions
- 6 Criterion of Truncation: Symmetric Systems
- 7 Real-Space DMRG
- 8 Implementation of Symmetries
- 9 DMRG with Nonlocal Basis States
- 10 Matrix Product States
- 11 Infinite Matrix Product States
- 12 Determination of MPS
- 13 Continuous Matrix Product States
- 14 Classical Transfer Matrix Renormalization
- 15 Criterion of Truncation: Nonsymmetric Systems
- 16 Renormalization of Quantum Transfer Matrices
- 17 MPS Solution of QTMRG
- 18 Dynamical Correlation Functions
- 19 Time-Dependent Methods
- 20 Tangent-Space Approaches
- 21 Tree Tensor Network States
- 22 Two-Dimensional Tensor Network States
- 23 Coarse-Graining Tensor Renormalization
- Appendix Other Numerical Methods
- References
- Index
23 - Coarse-Graining Tensor Renormalization
Published online by Cambridge University Press: 18 January 2024
Summary
Coarse-graining renormalization aims to reformulate a tensor network model as a coarse-grained one at a larger length scale. It has attracted particular attention in recent years because it opens a new avenue for unveiling the entanglement structure of a tensor network model under scaling transformations. This chapter reviews and compares the tensor renormalization group (TRG) and other coarse-graining methods developed over the past two decades. These methods fall into two groups, according to whether the renormalization effect of the environment tensors is incorporated in the optimization of the local tensors. The local optimization methods include TRG, HOTRG (a variant of TRG based on the higher-order singular value decomposition), tensor network renormalization (TNR), and loop-TNR. The global optimization methods include the second renormalized TRG and HOTRG, referred to as SRG and HOSRG, respectively. Among all these coarse-graining methods, HOTRG and HOSRG are the only two that can be readily extended and efficiently applied to three-dimensional classical or two-dimensional quantum lattice models.
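The elementary move behind TRG — split each local tensor by a truncated singular value decomposition, then contract the pieces around a plaquette into a coarse tensor on the rotated lattice — can be sketched as follows. This is a minimal illustration under assumed conventions (index order `[u, l, d, r]`, the 2D Ising local tensor as a concrete input), not the book's implementation; the function names are hypothetical.

```python
import numpy as np

def ising_tensor(beta):
    """Local tensor T[u, l, d, r] of the 2D Ising partition function,
    built by splitting each bond Boltzmann weight exp(beta * s * s')
    symmetrically and joining the four half-bonds at a site."""
    B = np.array([[np.exp(beta), np.exp(-beta)],
                  [np.exp(-beta), np.exp(beta)]])
    w, v = np.linalg.eigh(B)
    sq = v @ np.diag(np.sqrt(w)) @ v.T        # B = sq @ sq
    return np.einsum('au,al,ad,ar->uldr', sq, sq, sq, sq)

def trg_step(T, chi):
    """One Levin-Nave TRG coarse-graining step (a sketch).

    T[u, l, d, r] is the uniform local tensor; chi caps the new bond
    dimension. Returns the coarse tensor on the 45-degree-rotated lattice.
    """
    D = T.shape[0]
    # A-sublattice split: pair (r, u) against (l, d).
    M = np.einsum('uldr->ruld', T).reshape(D * D, D * D)
    U, s, Vh = np.linalg.svd(M, full_matrices=False)
    k = min(chi, int(np.sum(s > 1e-14)))
    S1 = (U[:, :k] * np.sqrt(s[:k])).reshape(D, D, k)         # S1[r, u, a]
    S2 = (np.sqrt(s[:k])[:, None] * Vh[:k]).reshape(k, D, D)  # S2[a, l, d]
    # B-sublattice split: pair (l, u) against (r, d).
    M = np.einsum('uldr->lurd', T).reshape(D * D, D * D)
    U, s, Vh = np.linalg.svd(M, full_matrices=False)
    k = min(chi, int(np.sum(s > 1e-14)))
    S3 = (U[:, :k] * np.sqrt(s[:k])).reshape(D, D, k)         # S3[l, u, b]
    S4 = (np.sqrt(s[:k])[:, None] * Vh[:k]).reshape(k, D, D)  # S4[b, r, d]
    # Contract the four half-tensors around one plaquette; h, g are its
    # internal horizontal bonds and v, w its internal vertical bonds.
    return np.einsum('hva,hwb,cgw,dgv->abcd', S1, S3, S2, S4)
```

When the truncation is exact (no singular values dropped), tracing the coarse tensor as `np.einsum('abab->', Tc)` reglues the splits and reproduces the two-site torus trace of the fine network, which is a convenient sanity check on the index bookkeeping.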
- Type: Chapter
- Book: Density Matrix and Tensor Network Renormalization, pp. 364-393
- Publisher: Cambridge University Press
- Print publication year: 2023