Book contents
- Frontmatter
- Contents
- Preface to the second edition
- Preface to the first edition
- 1 Introduction
- 2 Combinatorics
- 3 Sets and measures
- 4 Probability
- 5 Discrete random variables
- 6 Information and entropy
- 7 Communication
- 8 Random variables with probability density functions
- 9 Random vectors
- 10 Markov chains and their entropy
- Exploring further
- Appendix 1 Proof by mathematical induction
- Appendix 2 Lagrange multipliers
- Appendix 3 Integration of exp(−½x²)
- Appendix 4 Table of probabilities associated with the standard normal distribution
- Appendix 5 A rapid review of matrix algebra
- Selected solutions
- Index
7 - Communication
Published online by Cambridge University Press: 06 July 2010
Summary
Transmission of information
In this chapter we model the transmission of information across channels. We begin with a very simple model, shown in Fig. 7.1, and build further features into it as the chapter progresses.
The model consists of three components: a source of information, a channel across which the information is transmitted, and a receiver that picks up the information at the other end. For example, the source might be a radio or TV transmitter, the receiver a radio or TV set, and the channel the atmosphere through which the broadcast waves travel. Alternatively, the source might be a computer memory, the receiver a computer terminal, and the channel the network of wires and processors connecting them. In all cases that we consider, the channel is subject to ‘noise’, that is, uncontrollable random effects which distort the message, leading to potential loss of information at the receiver.
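The source–channel–receiver picture above can be sketched in code. The following is a minimal illustration, not taken from the text: it models noise as each transmitted symbol being independently replaced, with some small probability, by a different symbol from the alphabet. The function name, the flip probability, and the binary alphabet are all assumptions made for the sake of the example.

```python
import random

def noisy_channel(message, flip_prob=0.1, alphabet="01", seed=0):
    """Crude channel model: each symbol of `message` is independently
    replaced by a uniformly chosen *different* alphabet symbol with
    probability `flip_prob`; otherwise it passes through unchanged."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    received = []
    for symbol in message:
        if rng.random() < flip_prob:
            # noise event: corrupt the symbol
            symbol = rng.choice([a for a in alphabet if a != symbol])
        received.append(symbol)
    return "".join(received)
```

With `flip_prob = 0` the receiver sees exactly what the source sent; as `flip_prob` grows, more of the message is corrupted and information is lost.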
The source is modelled by a random variable S whose values {a1, a2, …, an} are called the source alphabet. The law of S is {p1, p2, …, pn}, where pi = P(S = ai). The fact that S is random allows us to include within our model the sender's uncertainty about which message they are going to send. In this context, a message is a succession of symbols from the source alphabet, sent out one after the other.
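A message, as described above, is a sequence of independent draws of S. The sketch below illustrates this, assuming an i.i.d. source (the text has not yet said whether successive symbols are independent, so that is an assumption here); the function name and the example alphabet are also invented for illustration.

```python
import random

def sample_message(alphabet, law, length, seed=0):
    """Emit a message of `length` symbols: each symbol is an independent
    draw from the source distribution `law` = {p1, ..., pn} over
    `alphabet` = {a1, ..., an}."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    return rng.choices(alphabet, weights=law, k=length)
```

For instance, `sample_message(["a", "b", "c"], [0.5, 0.3, 0.2], 10)` produces a ten-symbol message in which `a` appears with probability 0.5 at each position.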
Probability and Information: An Integrated Approach, pp. 127–154. Publisher: Cambridge University Press. Print publication year: 2008.