Assume that we want to estimate $-\sigma$, the abscissa of convergence of the Laplace transform. We show that no non-parametric estimator of $\sigma$ can converge at a rate faster than $(\log n)^{-1}$, where $n$ is the sample size. This optimal rate is achieved by an estimator based on the mean of the sample values overshooting a threshold $x_n = O(\log n)$. Under further parametric restrictions, this $(\log n)^{-1}$ phenomenon is also illustrated by a weak convergence result.
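To make the mean-overshoot idea concrete: if the sample distribution has a tail of the form $P(X > x) \approx C e^{-\sigma x}$, so that the Laplace transform converges exactly for arguments exceeding $-\sigma$, then the excesses over a high threshold are approximately exponential with mean $1/\sigma$, and the reciprocal of the mean excess estimates $\sigma$. The abstract elides the estimator's exact formula, so the sketch below is only one natural candidate consistent with the description, not the paper's definition; the threshold constant $c$, the choice $x_n = c\log n$, and the name `sigma_hat` are all ours.

```python
# Illustrative sketch only: assumes an exponential tail
# P(X > x) ~ C * exp(-sigma * x), under which excesses over a high
# threshold x_n are approximately Exp(sigma) with mean 1/sigma.
import numpy as np

def sigma_hat(x, c=0.25):
    """Reciprocal-mean-excess estimate of the tail rate sigma.

    x : 1-d sample; c : threshold constant, giving x_n = c*log(n) = O(log n).
    """
    n = len(x)
    x_n = c * np.log(n)           # threshold growing like O(log n)
    excess = x[x > x_n] - x_n     # overshoots of the threshold
    if excess.size == 0:
        raise ValueError("threshold too high: no sample values overshoot it")
    return 1.0 / excess.mean()    # Exp(sigma) excesses have mean 1/sigma

rng = np.random.default_rng(0)
sigma = 2.0                       # true tail rate; the abscissa is -sigma
for n in (10**3, 10**5, 10**7):
    x = rng.exponential(scale=1.0 / sigma, size=n)
    print(n, sigma_hat(x))        # consistent for sigma as n grows
```

Note that in this clean exponential model the estimator converges faster than $(\log n)^{-1}$; the slow logarithmic rate in the abstract is a worst-case (minimax) phenomenon over non-parametric classes of tails, which no estimator can beat uniformly.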