Book contents
- Frontmatter
- Contents
- List of code fragments
- Preface
- Part I Basic concepts
- Part II Pattern analysis algorithms
- 5 Elementary algorithms in feature space
- 6 Pattern analysis using eigen-decompositions
- 7 Pattern analysis using convex optimisation
- 8 Ranking, clustering and data visualisation
- Part III Constructing kernels
- Appendix A Proofs omitted from the main text
- Appendix B Notational conventions
- Appendix C List of pattern analysis methods
- Appendix D List of kernels
- References
- Index
6 - Pattern analysis using eigen-decompositions
from Part II - Pattern analysis algorithms
Published online by Cambridge University Press: 29 March 2011
Summary
The previous chapter saw the development of some basic tools for working in a kernel-defined feature space, resulting in some useful algorithms and techniques. The current chapter will extend these methods in order to understand the spread of the data in the feature space. This will be followed by an examination of the problem of identifying correlations between input vectors and target values. Finally, we discuss the task of identifying covariances between two different representations of the same object.
All of these important problems in kernel-based pattern analysis can be reduced to performing an eigen- or generalised eigen-analysis, that is, the problem of finding solutions of the equation Aw = λBw for given symmetric matrices A and B. These problems range from finding a set of k directions in the embedding space containing the maximum amount of variance in the data (principal components analysis (PCA)), through finding correlations between input and output representations (partial least squares (PLS)), to finding correlations between two different representations of the same data (canonical correlation analysis (CCA)). The Fisher discriminant analysis of Chapter 5 can also be cast as a generalised eigenvalue problem.
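As an illustrative sketch (not one of the book's code fragments; the Fisher-style scatter matrices below are only a hypothetical instance of the matrices A and B), the generalised eigenproblem Aw = λBw can be solved numerically in a few lines:

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical two-class data; the scatter matrices stand in for the
# symmetric matrices A and B of the generalised eigenproblem A w = lambda B w.
rng = np.random.default_rng(0)
X1 = rng.standard_normal((40, 3)) + 1.0   # class 1
X2 = rng.standard_normal((40, 3)) - 1.0   # class 2

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
A = np.outer(m1 - m2, m1 - m2)                           # between-class scatter
B = np.cov(X1, rowvar=False) + np.cov(X2, rowvar=False)  # within-class scatter
B += 1e-6 * np.eye(3)                                    # regularise for numerical stability

# Generalised eigen-decomposition: eigh solves A w = lambda B w directly.
eigvals, W = eigh(A, B)
w = W[:, -1]            # eigenvector with the largest generalised eigenvalue
assert np.allclose(A @ w, eigvals[-1] * (B @ w))
```

The single call to scipy.linalg.eigh with the pair (A, B) is one sense in which such problems reduce to standard routines from computational linear algebra.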
The importance of this class of algorithms is that the generalised eigenvector problem provides an efficient way of optimising an important family of cost functions; it can be studied with simple linear algebra and can be solved or approximated efficiently using a number of well-known techniques from computational algebra.
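For instance, the PCA directions mentioned above reduce to an ordinary eigen-decomposition of the centred kernel matrix. The following minimal NumPy sketch (our own illustration, not the book's code; the function name and the small floor on the eigenvalues are our choices) makes this reduction concrete:

```python
import numpy as np

def kernel_pca_projections(K, k):
    """Project the training points onto the top-k kernel principal
    components, given an n x n kernel matrix K (illustrative sketch)."""
    n = K.shape[0]
    J = np.ones((n, n)) / n
    Kc = K - J @ K - K @ J + J @ K @ J          # centre the data in feature space
    eigvals, eigvecs = np.linalg.eigh(Kc)       # eigenvalues in ascending order
    lam = eigvals[::-1][:k]                     # top-k variances
    alpha = eigvecs[:, ::-1][:, :k]             # corresponding dual vectors
    alpha = alpha / np.sqrt(np.maximum(lam, 1e-12))  # unit-norm feature-space directions
    return Kc @ alpha                           # n x k matrix of projections

# Example usage with a linear kernel on random data
X = np.random.default_rng(1).standard_normal((30, 4))
Z = kernel_pca_projections(X @ X.T, k=2)
```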
- Type: Chapter
- Information: Kernel Methods for Pattern Analysis, pp. 140-194
- Publisher: Cambridge University Press
- Print publication year: 2004