Book contents
- Frontmatter
- Contents
- Preface
- Part I Background
- Part II Applications, tools, and tasks
- Interlude — Good practices for scientific computing
- Part III Fundamentals
- Chapter 21 Networks demand network thinking: the friendship paradox
- Chapter 22 Network models
- Chapter 23 Statistical models and inference
- Chapter 24 Uncertainty quantification and error analysis
- Chapter 25 Ghost in the matrix: spectral methods for networks
- Chapter 26 Embedding and machine learning
- Chapter 27 Big data and scalability
- Conclusion
- Bibliography
- Index
Chapter 25 - Ghost in the matrix: spectral methods for networks
from Part III - Fundamentals
Published online by Cambridge University Press: 06 June 2024
Summary
Every network has a corresponding matrix representation. This is powerful: we can leverage tools from linear algebra within network science, and doing so brings great insights. The branch of graph theory concerned with such connections is called spectral graph theory. This chapter will introduce some of its central principles as we explore tools and techniques that use matrices and spectral analysis to work with network data. Many matrices arise when studying networks, including the modularity matrix, the nonbacktracking matrix, and the precision matrix. But one matrix stands out—the graph Laplacian. Not only does it capture dynamical processes unfolding over a network's structure, but its spectral properties also have deep connections to that structure. We show many relationships between the Laplacian's eigendecomposition and network problems, such as graph bisection and optimal partitioning tasks. Combining the dynamical information with the connections to partitioning also motivates spectral clustering, a powerful and successful way to find groups in data in general. This kind of technique is now at the heart of machine learning, which we'll explore soon.
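The chapter's own code is not reproduced here, but as a minimal sketch of the core idea, the Python snippet below (assuming networkx and numpy are available; the planted-partition test graph and all variable names are illustrative, not the book's) builds the graph Laplacian L = D - A and uses the sign of its second-smallest eigenvector, the Fiedler vector, to bisect a small graph with two planted groups.

```python
# Minimal sketch of spectral bisection via the graph Laplacian's Fiedler vector.
# Assumes networkx and numpy; the example graph is illustrative.
import networkx as nx
import numpy as np

# A small graph with two planted communities joined by a few cross edges.
G = nx.planted_partition_graph(2, 20, p_in=0.5, p_out=0.05, seed=42)

# Graph Laplacian L = D - A (degree matrix minus adjacency matrix).
L = nx.laplacian_matrix(G).toarray().astype(float)

# Eigendecomposition of the symmetric Laplacian; eigh returns eigenvalues
# in ascending order, so column 1 is the second-smallest eigenvector.
eigvals, eigvecs = np.linalg.eigh(L)
fiedler = eigvecs[:, 1]

# The sign of each Fiedler-vector entry assigns its node to one of two groups.
partition = {node: int(fiedler[i] > 0) for i, node in enumerate(G.nodes())}
print(partition)
```

Spectral clustering generalizes this recipe: embed the nodes using several of the Laplacian's low eigenvectors and then run an ordinary clustering algorithm (such as k-means) on those coordinates.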
- Type: Chapter
- Information: Working with Network Data: A Data Science Perspective, pp. 397–428
- Publisher: Cambridge University Press
- Print publication year: 2024