Infinite Variance

The idea that the variance of a distribution might be infinite is unfamiliar and somewhat disconcerting.  The problem is better framed in terms of sampling.  If a data sample is drawn from a Normal distribution, then as the sample size increases, the sample variance should converge to the variance of the distribution.  The graphic shows serial variance calculations for sample sizes 10 to 1000 from a Normal distribution, α = 2.  With γ = 1/√2 the variance of the distribution is 2γ² = 1, so the sample variance is expected to converge to 1.0.  As the sample size increases, the variance calculation does converge to the variance of the distribution.

The picture is very different for a sample with α = 1.7.  With the heavier distribution tail, large jumps cause sudden changes in the calculated variance, and it does not converge as the sample size grows.  At times it looks as if it is about to converge, until another extreme event disrupts the convergence.  Stable distributions nevertheless each have a domain of attraction determined by α.  Although the variance may not exist, the distributions have a well-defined shape and probabilities can be calculated.  For α > 1 the mean exists, and virtually all financial data sets appear to have α > 1.
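The contrast between the two cases can be reproduced with a short simulation.  The sketch below draws symmetric α-stable samples with the Chambers–Mallows–Stuck method and tracks the running sample variance; the function name, seed, and sample sizes are our own choices, not from the text, and `scipy.stats.levy_stable` could be used instead of the hand-rolled generator.

```python
import numpy as np

def symmetric_stable(alpha, gamma, size, rng):
    """Symmetric alpha-stable draws via the Chambers-Mallows-Stuck method.

    For alpha = 2 this reduces to a Normal with variance 2 * gamma**2,
    so gamma = 1/sqrt(2) gives unit variance.
    """
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)   # uniform angle
    w = rng.exponential(1.0, size)                 # unit exponential
    if alpha == 1.0:
        return gamma * np.tan(u)                   # Cauchy special case
    x = (np.sin(alpha * u) / np.cos(u) ** (1 / alpha)
         * (np.cos((1 - alpha) * u) / w) ** ((1 - alpha) / alpha))
    return gamma * x

rng = np.random.default_rng(0)

# alpha = 2, gamma = 1/sqrt(2): a standard Normal, so the running
# sample variance over sizes 10..1000 should settle near 1.0.
normal = symmetric_stable(2.0, 1 / np.sqrt(2), 1000, rng)
run_var = np.array([normal[:n].var(ddof=1) for n in range(10, 1001)])

# alpha = 1.7: heavy tails, so the same running variance jumps around
# whenever an extreme draw lands, and never settles.
heavy = symmetric_stable(1.7, 1 / np.sqrt(2), 1000, rng)
run_var_heavy = np.array([heavy[:n].var(ddof=1) for n in range(10, 1001)])
```

Plotting `run_var` and `run_var_heavy` against sample size reproduces the two graphics described above: the first curve flattens toward 1.0, the second is repeatedly knocked away from any limit by large jumps.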

Without getting into the mathematics, we have introduced the class of stable distributions, which have heavy tails and desirable characteristics for financial data.  They retain their shape under addition.  Log returns add across time: the sum of the log returns over short intervals is the log return over the longer interval.  In principle, then, the distribution could be scaled from short intervals to longer intervals, provided the distribution parameters are constant over the varying intervals.  This is an attractive idea, but it may not hold for financial data.  For the scaling to work, the distribution must not only be constant over time; the serial data must also be random, i.e. independent.  In other sections we will apply stable distributions to actual financial data.
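The scaling claim is easiest to check in the α = 2 (Normal) case, where the stability property says a sum of n independent unit-scale draws has the same shape rescaled by n^(1/α).  The sketch below, with illustrative values of our own choosing for n and the number of repetitions, sums n "short-interval log returns" and compares the empirical scale of the sum with the theoretical n^(1/α).

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = 2.0            # Normal case, where the scale is the std deviation
n, reps = 20, 20000    # n short intervals per long interval, reps trials

# iid unit-scale "short-interval log returns"
short = rng.standard_normal((reps, n))

# log returns add across intervals to give the longer-interval return
long_interval = short.sum(axis=1)

# stability under addition: the sum keeps the same shape, with its
# scale multiplied by n**(1/alpha) (= sqrt(20) here, about 4.47)
empirical_scale = long_interval.std()
theoretical_scale = n ** (1 / alpha)
```

For α < 2 the same n^(1/α) law applies to the stable scale parameter γ rather than to a standard deviation, which does not exist; and, as noted above, the law presumes the short-interval returns are independent with constant parameters.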