
Divisibility, decomposability, stability

Ways of slicing randomness

TODO: Clarify what it means for discrete- vs continuous-valued RVs.

TODO: all of these are about sums; but presumably we can construct this over other algebraic structures of distributions, e.g. max-stable processes.

Infinitely divisible

The Lévy-process property: the infinitely divisible laws are exactly the laws of Lévy processes observed at a fixed time.

A probability distribution is infinitely divisible if it can be expressed as the probability distribution of the sum of an arbitrary number of independent and identically distributed random variables. That is, the distribution \(F\) is infinitely divisible if, for every positive integer \(n\), there exist \(n\) i.i.d. RVs whose sum

\[X_1 + \dots + X_n = S_n \sim F\]
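A quick numerical sanity check of this (my illustration, not from the source): the Poisson law is the classic discrete example. Via characteristic functions, the CF of a sum of \(n\) i.i.d. Poisson\((\lambda/n)\) variables is the \(n\)th power of the component CF, which collapses back to the CF of Poisson\((\lambda)\) for every \(n\):

```python
import numpy as np

# Infinite divisibility of the Poisson law, checked via characteristic
# functions: X ~ Poisson(lam) has CF exp(lam * (e^{it} - 1)). Summing n
# i.i.d. Poisson(lam / n) components multiplies their CFs, recovering
# the CF of Poisson(lam) exactly, for every n.

def poisson_cf(lam, t):
    """Characteristic function of a Poisson(lam) random variable."""
    return np.exp(lam * (np.exp(1j * t) - 1))

lam = 3.0
t = np.linspace(-5, 5, 101)
for n in (2, 10, 1000):
    cf_of_sum = poisson_cf(lam / n, t) ** n
    assert np.allclose(cf_of_sum, poisson_cf(lam, t))
```

The same computation with the Gaussian CF \(\exp(-t^2/2)\) gives the continuous analogue.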

Finitely divisible

Presumably if it doesn’t work for every \(n\) you have finite divisibility, but I haven’t seen this one in use.


Decomposable

The distribution of \(X\) is decomposable if it is the distribution of a sum of 2 or more non-constant independent RVs, not necessarily from the same family. Not a very strong property, but the cases where an RV fails to possess even this are interesting.
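A small brute-force illustration (mine, not from the source): the uniform law on \(\{0,1,2,3\}\) decomposes as \(B_0 + 2B_1\) for independent fair Bernoulli variables, i.e. the two binary digits of a uniform 2-bit integer. Note the components are not identically distributed, which is exactly what decomposability (as opposed to divisibility) permits:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Decomposability by enumeration: the uniform law on {0, 1, 2, 3} is the
# law of B0 + 2*B1 for independent fair Bernoullis B0, B1. The two
# summands B0 (values {0, 1}) and 2*B1 (values {0, 2}) are non-constant
# but not identically distributed.

half = Fraction(1, 2)
bernoulli = {0: half, 1: half}

dist = Counter()
for (b0, p0), (b1, p1) in product(bernoulli.items(), repeat=2):
    dist[b0 + 2 * b1] += p0 * p1

assert dist == {k: Fraction(1, 4) for k in range(4)}
```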


Decomposable, but the components must be in the same family.


Stable

A distribution or a random variable is said to be stable if a linear combination of two independent copies of it has the same distribution, up to location and scale parameters.
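For the Gaussian case this can be verified exactly through characteristic functions (a sketch of mine, not from the source): for independent standard normals \(X_1, X_2\), the sum \(aX_1 + bX_2\) has CF \(\varphi(at)\varphi(bt)\), which equals \(\varphi(ct)\) with \(c = \sqrt{a^2 + b^2}\), i.e. the same law rescaled:

```python
import numpy as np

# Stability of the standard Gaussian via characteristic functions:
# phi(t) = exp(-t^2 / 2), and phi(a t) * phi(b t) = phi(c t) with
# c = sqrt(a^2 + b^2), so a*X1 + b*X2 has the same law as c*X.

def gauss_cf(t):
    """CF of a standard normal random variable."""
    return np.exp(-t**2 / 2)

a, b = 1.7, 0.4
c = np.hypot(a, b)
t = np.linspace(-4, 4, 101)
assert np.allclose(gauss_cf(a * t) * gauss_cf(b * t), gauss_cf(c * t))
```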

This induces at least 2 families of infinitely divisible distributions, the discrete and the continuous stable families; casually put, the former generalises Poisson RVs and the latter, Gaussians.

See HaSt93.

A well-known distribution construction: the stable distribution class.

For a continuous-valued, continuous- or discrete-indexed stochastic process \(X(t)\), \(\alpha\)-stability implies that the law of the value of the process at certain times satisfies a stability equation \[X(a) \triangleq W^{1/\alpha}X(b),\] where \(0 < a < b\), \(\alpha > 0\) and \(W\sim \mathrm{Unif}([0,1])\perp X\).

The marginal distributions of such processes are those of the \(\alpha\)-stable processes. For \(\alpha=2\) we have Gaussians and for \(\alpha=1\), the Cauchy law.
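These two endpoints can be checked numerically against scipy's \(\alpha\)-stable implementation (my sketch, not from the source; assumes scipy's default parametrisation, in which \(\alpha=2\) gives a Gaussian of scale \(\sqrt{2}\) and \(\alpha=1, \beta=0\) gives the standard Cauchy):

```python
import numpy as np
from scipy.stats import cauchy, levy_stable, norm

# Endpoint checks for the alpha-stable family, comparing densities at a
# few points. In scipy's default parametrisation, alpha=2 corresponds to
# N(0, 2), i.e. a normal with scale sqrt(2), and alpha=1 with beta=0 is
# the standard Cauchy law.

x = np.array([-2.0, 0.0, 1.5])

gauss_pdf = levy_stable.pdf(x, alpha=2, beta=0)
assert np.allclose(gauss_pdf, norm.pdf(x, scale=np.sqrt(2)), atol=1e-6)

cauchy_pdf = levy_stable.pdf(x, alpha=1, beta=0)
assert np.allclose(cauchy_pdf, cauchy.pdf(x), atol=1e-6)
```

For intermediate \(\alpha\) there is no closed-form density, which is why `levy_stable` evaluates it numerically.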

(NB – how does this reduce to the usual linear-combination formulation?)