Book contents
- Frontmatter
- Contents
- Preface to the second edition
- Preface to the first edition
- 1 Introduction
- 2 Combinatorics
- 3 Sets and measures
- 4 Probability
- 5 Discrete random variables
- 6 Information and entropy
- 7 Communication
- 8 Random variables with probability density functions
- 9 Random vectors
- 10 Markov chains and their entropy
- Exploring further
- Appendix 1 Proof by mathematical induction
- Appendix 2 Lagrange multipliers
- Appendix 3 Integration of exp(−½x²)
- Appendix 4 Table of probabilities associated with the standard normal distribution
- Appendix 5 A rapid review of matrix algebra
- Selected solutions
- Index
1 - Introduction
Published online by Cambridge University Press: 06 July 2010
Chance and information
Our experience of the world leads us to conclude that many events are unpredictable and sometimes quite unexpected. These may range from the outcome of seemingly simple games, such as tossing a coin and trying to guess whether it will be heads or tails, to the sudden collapse of governments or a dramatic fall in share prices on the stock market. When we try to interpret such events, we are likely to take one of two approaches – we will either shrug our shoulders and say it was due to ‘chance’, or we will argue that we might have been better able to predict, for example, the government's collapse if only we'd had more ‘information’ about the machinations of certain ministers. One of the main aims of this book is to demonstrate that these two concepts of ‘chance’ and ‘information’ are more closely related than you might think. Indeed, when faced with uncertainty, our natural tendency is to search for information that will help us reduce the uncertainty in our own minds; think, for example, of the gambler about to bet on the outcome of a race, combing the sporting papers beforehand for hints about the form of the jockeys and the horses.
Before we proceed further, we should clarify our understanding of the concept of chance. It may be argued that the tossing of fair, unbiased coins is an ‘intrinsically random’ procedure in that everyone in the world is equally ignorant of whether the result will be heads or tails.
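The link between chance and information sketched above is made precise later in the book (Chapter 6, ‘Information and entropy’) via Shannon entropy. As a small illustrative sketch – not part of the original text – the snippet below computes the entropy, in bits, of a coin toss; the function name `shannon_entropy` is our own choice. A fair coin, about which ‘everyone is equally ignorant’, attains the maximum uncertainty of 1 bit, while a biased coin is more predictable and so carries less entropy.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: both outcomes equally likely, maximal uncertainty.
fair = shannon_entropy([0.5, 0.5])      # 1.0 bit

# A biased coin: heads 90% of the time, so the outcome is easier
# to guess and the entropy is lower.
biased = shannon_entropy([0.9, 0.1])    # about 0.469 bits

print(fair, biased)
```

This quantifies the intuition above: the more information we already have about an outcome (here, the bias of the coin), the less uncertainty remains to be resolved.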
Probability and Information: An Integrated Approach, pp. 1–9. Publisher: Cambridge University Press. Print publication year: 2008.