Analysis of a phase transition phenomenon in packet networks
Published online by Cambridge University Press: 01 July 2016
Abstract
The multiplexing of variable bit rate traffic streams in a packet network gives rise to two types of queueing. On a small time-scale, the rates at which the sources send are more or less constant, but there is queueing due to simultaneous packet arrivals (the packet-level effect). On a somewhat larger time-scale, queueing results from a relatively large number of sources sending at rates above their average (the burst-level effect). This paper explores these effects. In particular, we give asymptotics of the overflow probability in the combined packet/burst-scale model. It is shown that there is a specific buffer size (the 'critical buffer size') below which packet-scale effects are dominant and above which burst-scale effects essentially determine the performance; strikingly, the demarcation is sharp: the so-called 'phase transition'. The results are asymptotic in the number of sources n. We scale the buffer space B and the link rate C by n, to nb and nc, respectively, and then let n grow large. Applying large deviations theory, we show that in this regime the overflow probability decays exponentially in the number of sources n. For small buffers the corresponding decay rate can be calculated explicitly; for large buffers we derive an asymptote (linear in b). The results for small and large buffers give rise to an approximation for the decay rate (for general b), as well as for the critical buffer size. A numerical example (the multiplexing of voice streams) confirms the accuracy of these approximations.
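To make the scaling described above concrete, the following display sketches the many-sources asymptotic referred to in the abstract. The notation is illustrative: the decay-rate function I(b) and the slope and intercept δ, ν of its large-buffer asymptote are names assumed here for exposition, not necessarily those used in the paper.

```latex
% Many-sources regime sketched in the abstract (illustrative notation):
% n sources share buffer space B = nb and link rate C = nc; Q_n denotes the
% stationary queue length. The overflow probability decays exponentially in n,
\[
  \lim_{n\to\infty} \frac{1}{n}\,\log \mathbb{P}\bigl(Q_n \ge n b\bigr) \;=\; -\,I(b),
\]
% with a linear asymptote in the large-buffer (burst-scale) regime,
\[
  I(b) \;\approx\; \delta\, b + \nu \qquad \text{for large } b,
\]
% while for small b (packet-scale regime) the decay rate is computed explicitly
% in the paper. The critical buffer size marks where the small- and large-buffer
% approximations of I(b) cross, i.e. where the dominant effect switches.
```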
- Type: General Applied Probability
- Copyright: © Applied Probability Trust 2001