Book contents
- Frontmatter
- Contents
- Preface
- 1 Introduction
- Part one Pattern Classification with Binary-Output Neural Networks
- 2 The Pattern Classification Problem
- 3 The Growth Function and VC-Dimension
- 4 General Upper Bounds on Sample Complexity
- 5 General Lower Bounds on Sample Complexity
- 6 The VC-Dimension of Linear Threshold Networks
- 7 Bounding the VC-Dimension using Geometric Techniques
- 8 Vapnik-Chervonenkis Dimension Bounds for Neural Networks
- Part two Pattern Classification with Real-Output Networks
- Part three Learning Real-Valued Functions
- Part four Algorithmics
- Appendix 1 Useful Results
- Bibliography
- Author index
- Subject index
3 - The Growth Function and VC-Dimension
Published online by Cambridge University Press: 26 February 2010
Introduction
The previous chapter gave a formal definition of the learning problem, and showed that it can be solved if the class H_N of functions is finite. However, many interesting function classes are not finite. For example, the number of functions computed by the perceptron with real-valued weights and inputs is infinite. Many other neural networks can also be represented as a parameterized function class with an infinite parameter set. We shall see that learning is possible for many (but not all) such infinite function classes, provided the function class is not too complex. In this chapter, we examine two measures of the complexity of a function class, the growth function and the VC-dimension, and we show that these are intimately related. In the next two chapters, we shall see that the growth function and VC-dimension of a function class determine the inherent sample complexity of the learning problem.
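For orientation, the two measures have the following standard definitions (they are developed formally in the rest of this chapter). Writing H|_S for the restriction of H to a finite set S, introduced in the next section, the growth function of H is

    Pi_H(m) = max { |H|_S| : S ⊆ X, |S| = m },

the largest number of distinct classifications of m points achievable by H, and the VC-dimension of H is

    VCdim(H) = max { m : Pi_H(m) = 2^m },

the size of the largest set of points that H can classify in every possible way.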
The Growth Function
Consider a finite subset S of the input space X. For a function class H, the restriction of H to the set S (that is, the set of restrictions to S of all functions in H) is denoted by H|_S. If H|_S is the set of all functions from S to {0, 1}, then clearly H is as powerful as it can be in classifying the points in S. We can view the cardinality of H|_S (and in particular how it compares with 2^{|S|}) as a measure of the classification complexity of H with respect to the set S.
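As a minimal illustration (not from the text), the following Python sketch enumerates H|_S exactly for the infinite one-dimensional threshold class H = { x -> 1[x >= theta] : theta in R }; the function name and the sample set S below are chosen here purely for illustration.

    # Enumerate the restriction H|_S of the (infinite) class of
    # one-dimensional threshold functions h_theta(x) = 1 if x >= theta else 0.
    def restriction_of_thresholds(S):
        pts = sorted(S)
        m = len(pts)
        labelings = set()
        # As theta sweeps the real line, exactly the points >= theta are
        # labelled 1, so at most m + 1 distinct labelings of S can arise.
        for k in range(m + 1):
            labelings.add(tuple(1 if i >= k else 0 for i in range(m)))
        return labelings

    S = [0.5, 1.7, 3.2]
    H_S = restriction_of_thresholds(S)
    print(len(H_S), "of", 2 ** len(S), "possible classifications")
    # Prints "4 of 8": |H|_S| = |S| + 1 < 2^{|S|}, so this H, although
    # infinite as a parameterized class, is far from maximally powerful on S.

Here |H|_S| grows only linearly with |S|, the kind of limited classification complexity that, as the chapter goes on to show, is what makes learning with an infinite class possible.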
Neural Network Learning: Theoretical Foundations, pp. 29-41. Publisher: Cambridge University Press. Print publication year: 1999.