Book contents
- Frontmatter
- Contents
- Preface to the second edition
- Preface to the first edition
- 1 Introduction
- 2 Combinatorics
- 3 Sets and measures
- 4 Probability
- 5 Discrete random variables
- 6 Information and entropy
- 7 Communication
- 8 Random variables with probability density functions
- 9 Random vectors
- 10 Markov chains and their entropy
- Exploring further
- Appendix 1 Proof by mathematical induction
- Appendix 2 Lagrange multipliers
- Appendix 3 Integration of exp(−½x²)
- Appendix 4 Table of probabilities associated with the standard normal distribution
- Appendix 5 A rapid review of matrix algebra
- Selected solutions
- Index
Preface to the second edition
Published online by Cambridge University Press: 06 July 2010
Summary
When I wrote the first edition of this book in the early 1990s, it was designed as an undergraduate text giving a unified introduction to the mathematics of ‘chance’ and ‘information’. I am delighted that many courses (mainly in Australasia and the USA) have adopted the book as a core text, and I have been pleased to receive so much positive feedback from both students and instructors since the book first appeared.

For this second edition I have resisted the temptation to expand the existing text, and most of the changes to the first nine chapters are corrections of errors and typos. The main new ingredient is a further chapter (Chapter 10), which brings a third important concept, that of ‘time’, into play via an introduction to Markov chains and their entropy. The mathematical device for combining time and chance is called a ‘stochastic process’, and such processes play an increasingly important role in mathematical modelling in areas as diverse (and important) as mathematical finance and climate science. Markov chains form a highly accessible subclass of stochastic (random) processes, and nowadays they often appear in first-year courses (at least in British universities).

From a pedagogic perspective, the early study of Markov chains also gives students additional insight into the importance of matrices within an applied context. This theme is stressed heavily in the approach presented here, which is based on courses taught at both Nottingham Trent and Sheffield Universities.
Probability and Information: An Integrated Approach, pp. xi–xii. Publisher: Cambridge University Press. Print publication year: 2008.