We consider a queue fed by a large number, say n, of on–off sources with generally distributed on- and off-times. The queueing resources are scaled by n: the buffer is B ≡ nb and the link rate is C ≡ nc. The model is versatile: it allows one to model both long-range-dependent traffic (by using heavy-tailed on-periods) and short-range-dependent traffic (by using light-tailed on-periods). A crucial performance metric in this model is the steady-state buffer overflow probability.
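To fix ideas, the following is a minimal sketch, not taken from the paper, of the many-sources scaling and of naive simulation of the overflow probability. It uses a discrete-time fluid buffer fed by n i.i.d. on–off sources with geometric on- and off-periods; the transition probabilities, peak rate, horizon, and parameter values are illustrative assumptions (the paper allows general on- and off-time distributions).

```python
# Assumed toy model: n i.i.d. on-off sources with geometric on/off periods feed a
# fluid buffer of size n*b, drained at rate n*c. All parameter values are ad hoc.
import numpy as np

rng = np.random.default_rng(0)

def overflow_in_run(n, b, c, p_on_off=0.2, p_off_on=0.1, peak=1.0, slots=200):
    """One naive run: does the buffer content exceed n*b within `slots` slots?"""
    on = rng.random(n) < p_off_on / (p_on_off + p_off_on)  # stationary start
    buf = 0.0
    for _ in range(slots):
        u = rng.random(n)
        on = np.where(on, u > p_on_off, u < p_off_on)   # geometric on/off periods
        buf = max(0.0, buf + peak * on.sum() - n * c)   # fluid buffer recursion
        if buf >= n * b:
            return True                                  # overflow observed
    return False

def naive_estimate(n, b, c, runs=5_000):
    """Crude Monte Carlo: fraction of runs in which overflow occurs."""
    return sum(overflow_in_run(n, b, c) for _ in range(runs)) / runs

print(naive_estimate(n=20, b=1.0, c=0.6))
```

Because the overflow probability decays exponentially in n, the number of runs needed for a reliable naive estimate grows exponentially, which motivates the fast simulation technique described next.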
This probability decays exponentially in n. Therefore, when n grows large, naive simulation becomes too time-consuming and fast simulation techniques have to be used. Owing to the exponential decay (in n), importance sampling with an exponential change of measure goes through, irrespective of whether the on-times are heavy- or light-tailed. An asymptotically optimal change of measure is found by using large deviations arguments. Notably, the change of measure is not constant during the simulation run, which distinguishes our approach from many other studies (which usually rely on large-buffer asymptotics).
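The sketch below illustrates the general idea of importance sampling with an exponentially tilted measure in the toy model of the previous block; it is not the paper's change of measure. The sources' transition probabilities are tilted (q_on_off, q_off_on, chosen ad hoc here) so that overflow becomes likely, and each run is weighted by its likelihood ratio. In the paper, the asymptotically optimal tilt is derived from large deviations and, crucially, is adapted during the run rather than held fixed as it is here.

```python
# Assumed importance-sampling sketch: sample source transitions under tilted
# probabilities q_*, weight each run by the likelihood ratio d P / d Q of the
# sampled path up to the overflow epoch. Tilt values are illustrative only.
import numpy as np

rng = np.random.default_rng(1)

def is_run(n, b, c, p_on_off=0.2, p_off_on=0.1, q_on_off=0.05, q_off_on=0.4,
           peak=1.0, slots=200):
    """One importance-sampling run: returns likelihood ratio * 1{overflow}."""
    on = rng.random(n) < p_off_on / (p_on_off + p_off_on)  # stationary start
    buf, log_lr = 0.0, 0.0
    for _ in range(slots):
        u = rng.random(n)
        # Sample the transitions under the tilted measure Q.
        new_on = np.where(on, u > q_on_off, u < q_off_on)
        # Per-source transition probabilities under P and under Q.
        p_step = np.where(on,
                          np.where(new_on, 1 - p_on_off, p_on_off),
                          np.where(new_on, p_off_on, 1 - p_off_on))
        q_step = np.where(on,
                          np.where(new_on, 1 - q_on_off, q_on_off),
                          np.where(new_on, q_off_on, 1 - q_off_on))
        log_lr += np.log(p_step).sum() - np.log(q_step).sum()
        on = new_on
        buf = max(0.0, buf + peak * on.sum() - n * c)
        if buf >= n * b:
            return np.exp(log_lr)   # overflow: contribute the likelihood ratio
    return 0.0                      # no overflow: contributes zero

def is_estimate(n, b, c, runs=2_000):
    """Unbiased estimate of the overflow probability under the tilted sampling."""
    return sum(is_run(n, b, c) for _ in range(runs)) / runs

print(is_estimate(n=20, b=1.0, c=0.6))
```

Stopping at the first overflow and weighting by the likelihood ratio of the stopped path keeps the estimator unbiased; the variance, however, depends entirely on how well the tilt is chosen, which is precisely what the large deviations analysis addresses.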
Numerical examples show that our procedure improves considerably over naive simulation. We present the achieved accelerations, discuss the influence of the shape of the distributions on the overflow probability, and describe the limitations of our technique.