Book contents
- Frontmatter
- Dedication
- Contents
- Preface
- Notation
- Contributors
- 1 Introduction to Information Theory and Data Science
- 2 An Information-Theoretic Approach to Analog-to-Digital Compression
- 3 Compressed Sensing via Compression Codes
- 4 Information-Theoretic Bounds on Sketching
- 5 Sample Complexity Bounds for Dictionary Learning from Vector- and Tensor-Valued Data
- 6 Uncertainty Relations and Sparse Signal Recovery
- 7 Understanding Phase Transitions via Mutual Information and MMSE
- 8 Computing Choice: Learning Distributions over Permutations
- 9 Universal Clustering
- 10 Information-Theoretic Stability and Generalization
- 11 Information Bottleneck and Representation Learning
- 12 Fundamental Limits in Model Selection for Modern Data Analysis
- 13 Statistical Problems with Planted Structures: Information-Theoretical and Computational Limits
- 14 Distributed Statistical Inference with Compressed Data
- 15 Network Functional Compression
- 16 An Introductory Guide to Fano’s Inequality with Applications in Statistical Estimation
- Index
- References
10 - Information-Theoretic Stability and Generalization
Published online by Cambridge University Press: 22 March 2021
Summary
Machine-learning algorithms can be viewed as stochastic transformations that map training data to hypotheses. Following Bousquet and Elisseeff, we say such an algorithm is stable if its output does not depend too much on any individual training example. Since stability is closely connected to the generalization capabilities of learning algorithms, it is of interest to obtain sharp quantitative estimates on the generalization bias of machine-learning algorithms in terms of their stability properties. We describe several information-theoretic measures of algorithmic stability and illustrate their use for upper-bounding the generalization bias of learning algorithms. Specifically, we relate the expected generalization error of a learning algorithm to several information-theoretic quantities that capture the statistical dependence between the training data and the hypothesis. These include mutual information and erasure mutual information, and their counterparts induced by the total variation distance. We illustrate the general theory through examples, including the Gibbs algorithm and differentially private algorithms, and discuss strategies for controlling the generalization error.
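To fix ideas, a representative bound of this type from the information-theoretic generalization literature is sketched below (stated here for orientation; the chapter's own statements and conditions may differ). Suppose the training sample $S = (Z_1, \dots, Z_n)$ is drawn i.i.d. from a distribution $\mu$, the algorithm outputs a hypothesis $W$, and the loss $\ell(w, Z)$ is $\sigma$-sub-Gaussian under $\mu$ for every $w$. Then the expected generalization error is controlled by the mutual information between the data and the hypothesis:

$$\bigl| \mathbb{E}\bigl[L_\mu(W) - L_S(W)\bigr] \bigr| \;\le\; \sqrt{\frac{2\sigma^2 \, I(S;W)}{n}},$$

where $L_\mu(w) = \mathbb{E}_{Z \sim \mu}[\ell(w,Z)]$ is the population risk and $L_S(w) = \frac{1}{n}\sum_{i=1}^{n} \ell(w, Z_i)$ is the empirical risk. Bounds of this form make precise the intuition that an algorithm whose output retains little information about its training data cannot overfit; the erasure mutual information and total-variation counterparts mentioned above refine the same idea.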
Information
Information-Theoretic Methods in Data Science, pp. 302–329
Publisher: Cambridge University Press
Print publication year: 2021