
Lévy processes

\(\renewcommand{\var}{\operatorname{Var}} \renewcommand{\dd}{\mathrm{d}} \renewcommand{\pd}{\partial} \renewcommand{\bb}[1]{\mathbb{#1}} \renewcommand{\bf}[1]{\mathbf{#1}} \renewcommand{\vv}[1]{\boldsymbol{#1}} \renewcommand{\mm}[1]{\mathrm{#1}} \renewcommand{\cc}[1]{\mathcal{#1}} \renewcommand{\oo}[1]{\operatorname{#1}} \renewcommand{\gvn}{\mid} \renewcommand{\II}{\mathbb{I}}\)

Stochastic processes whose increments over disjoint intervals are independent, and identically distributed when the intervals have the same length.

Specific examples of interest include Gamma processes, Brownian motions, continuous-time branching processes…

Let’s start with George Lowther:

Continuous-time stochastic processes with stationary independent increments are known as Lévy processes. […] it was seen that processes with independent increments are described by three terms — the covariance structure of the Brownian motion component, a drift term, and a measure describing the rate at which jumps occur. Being a special case of independent increments processes, the situation with Lévy processes is similar. […]

A \(d\)-dimensional Lévy process \(\Lambda(\cdot)\) is a stochastic process indexed by \(t\in\bb{R}_{\geq 0}\) taking values in \({\mathbb R}^d\), with \(\Lambda(0)=0\) a.s., such that it possesses

  1. independent increments: \(\Lambda(t)-\Lambda(s)\) is independent of \(\{\Lambda(u)\colon u\le s\}\) for any \(s<t.\)

  2. stationary increments: \(\Lambda({s+t})-\Lambda(s)\) has the same distribution as \(\Lambda(t)-\Lambda(0)\) for any \(s,t>0.\)

  3. continuity in probability: \(\Lambda(s)\rightarrow \Lambda(t)\) in probability as \(s\rightarrow t.\)

For a more thorough presentation, see e.g. (Applebaum, 2009).

General form

TODO.

Intensity measure

TODO.

Subordinators

A subordinator is just a Lévy process which happens to be non-decreasing, i.e. an a.s. non-decreasing stochastic process \(\Lambda(t),\, t \geq 0,\) with state space \(\mathbb{R}\) such that

\[ \mathbb{P}(\Lambda(t)-\Lambda(s)\lt 0)=0, \,\forall t \geq s \]

The three properties above are standard Lévy process stuff. The non-negativity of increments is only for subordinators.

Some definitions additionally require that the increment distribution be a.s. positive, rather than merely non-negative, or that it have no atom at zero, or no atoms at all.

One thing that everyone agrees is a subordinator is the Gamma process.

Gamma processes

Gamma processes are well studied elsewhere (Applebaum, 2009; Asmussen & Glynn, 2007; Rubinstein & Kroese, 2016), and existence proofs, for example, are deferred to those other sources. You could also see Wikipedia, which I’m editing concurrently with this. We write it as \(\Lambda(t;\alpha ,\lambda )\). Wikipedia graciously tells us that a Gamma process is a pure-jump process with jump intensity given by \[\nu (x)=\alpha x^{-1}\exp(-\lambda x).\]

That is, the Poisson rate, with respect to “time” \(t\), of jumps whose size lies in the range \([x, x+dx)\) is \(\nu(x)\,dx.\) We think of this as an infinite superposition of Poisson processes (covered later) driving different sized jumps, where the jumps are mostly tiny.
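
As a quick consistency check (using the standard moment formulas for a pure-jump Lévy process of finite variation), this intensity reproduces the moments of the marginals given below: \[\bb E(\Lambda(1))=\int_0^\infty x\,\nu(x)\,\dd x =\int_0^\infty \alpha e^{-\lambda x}\,\dd x =\frac{\alpha}{\lambda}, \qquad \var(\Lambda(1))=\int_0^\infty x^2\,\nu(x)\,\dd x =\frac{\alpha}{\lambda^2}.\]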

The marginal distribution of an increment of duration \(t\) is given by the Gamma distribution. Specifically, the marginal density \(g(x;t,\alpha, \lambda )\) of the process at time \(t\) is \[ g(x;t, \alpha, \lambda) =\frac{ \lambda^{\alpha t} } { \Gamma (\alpha t) } x^{\alpha t\,-\,1}e^{-\lambda x}, \quad x\geq 0, \] which corresponds to increments per unit time with \(\bb E(\Lambda(1))=\alpha/\lambda\) and \(\var(\Lambda(1))=\alpha/\lambda^2.\)

This is recognisable as the shape-rate parameterisation of a Gamma RV, with rate \(\lambda\) and shape \(\alpha t.\)

Note that if \(\alpha t=1,\) then \(\Lambda(t;\alpha ,\lambda )\sim \operatorname{Exp}(\lambda).\)

Gamma distributions

We momentarily consider the vanilla Gamma distribution to explore the use of the Gamma process.

The density \(g(x;\alpha, \lambda )\) of the univariate Gamma is \(g(x; \alpha, \lambda)=\frac{ \lambda^{\alpha} }{ \Gamma (\alpha) } x^{\alpha\,-\,1}e^{-\lambda x}, x\geq 0.\) This is once again the shape-rate parameterisation, with rate \(\lambda\) and shape \(\alpha,\) that we saw in the Gamma process. Tautologically, we can think of the Gamma distribution as the distribution at time 1 of a Gamma process.

If \(G\sim \operatorname{Gamma}(\alpha, \lambda)\) then \(\bb E(G)=\alpha/\lambda\) and \(\var(G)=\alpha/\lambda^2.\)

We use various facts about the Gamma distribution (sanity-checked numerically in the sketch after this list).

  1. \(\operatorname{Gamma}(1, \lambda)=\operatorname{Exp}(\lambda)\) (the exponential relation)
  2. If \(G_1\sim \operatorname{Gamma}(\alpha_1, \lambda),\,G_2\sim \operatorname{Gamma}(\alpha_2, \lambda),\) and \(G_1\perp G_2,\) then \(G_1+G_2\sim \operatorname{Gamma}(\alpha_1+\alpha_2, \lambda)\) (the additive rule)
  3. If \(G\sim \operatorname{Gamma}(\alpha, \lambda)\) then \(cG\sim \operatorname{Gamma}(\alpha, \lambda/c)\) (the multiplicative rule)
  4. If \(G_1\sim \operatorname{Gamma}(\alpha_1, \lambda)\perp G_2\sim \operatorname{Gamma}(\alpha_2, \lambda),\) then \(\frac{G_1}{G_1+G_2}\sim \operatorname{Beta}(\alpha_1, \alpha_2),\) independent of \(G_1+G_2\) (the stick-breaking rule)
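
Here is a minimal Monte Carlo sanity check of the additive and stick-breaking rules in numpy (the parameter values and tolerances are arbitrary choices of mine, not canonical):

```python
import numpy as np

rng = np.random.default_rng(42)
alpha1, alpha2, lam, n = 2.0, 3.0, 1.5, 200_000

# numpy parameterises the Gamma by shape and *scale*, so scale = 1/rate.
G1 = rng.gamma(shape=alpha1, scale=1 / lam, size=n)
G2 = rng.gamma(shape=alpha2, scale=1 / lam, size=n)

# Additive rule: G1 + G2 ~ Gamma(alpha1 + alpha2, lam).
print(np.mean(G1 + G2), (alpha1 + alpha2) / lam)     # sample vs exact mean
print(np.var(G1 + G2), (alpha1 + alpha2) / lam**2)   # sample vs exact variance

# Stick-breaking rule: B = G1/(G1+G2) ~ Beta(alpha1, alpha2),
# independent of the sum G1 + G2.
B = G1 / (G1 + G2)
print(np.mean(B), alpha1 / (alpha1 + alpha2))        # Beta mean
print(np.corrcoef(B, G1 + G2)[0, 1])                 # approximately 0
```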

The actual Gamma process

Returning to the Gamma process: the univariate Gamma process is a Lévy (i.e. independent-increment) process, which we parameterise once again by \(\alpha, \lambda.\) This leads to a method for simulating the paths of a Gamma process at a sequence of increasing times, \(\{t_1, t_2, t_3, \dots, t_L\}.\) Setting \(t_0=0\) and \(\Lambda(t_0)=0,\) we know that the increments are distributed as independent variates \(G_i:=\Lambda(t_{i+1})-\Lambda(t_{i})\sim \operatorname{Gamma}(\alpha(t_{i+1}-t_{i}), \lambda)\). Presuming we may simulate from the Gamma distribution, it follows that \[\Lambda(t_i)=\sum_{j \lt i}\left( \Lambda(t_{j+1})-\Lambda(t_{j})\right)=\sum_{j \lt i} G_j.\]
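
A minimal sketch of that recipe in numpy (the function name and interface are mine):

```python
import numpy as np

def gamma_process_path(ts, alpha, lam, rng):
    """Simulate a Gamma process at the increasing times ts, with Lambda(0) = 0.

    Each increment over (t_i, t_{i+1}] is an independent
    Gamma(alpha * (t_{i+1} - t_i), rate=lam) variate; the path is
    the cumulative sum of the increments.
    """
    dts = np.diff(ts, prepend=0.0)
    # numpy parameterises the Gamma by shape and scale = 1/rate.
    increments = rng.gamma(shape=alpha * dts, scale=1 / lam)
    return np.cumsum(increments)

rng = np.random.default_rng(1)
ts = np.linspace(0.1, 1.0, 10)
print(gamma_process_path(ts, alpha=2.0, lam=3.0, rng=rng))
```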

A \(d\)-dimensional gamma process is simply the concatenation of \(d\) independent univariate Gamma processes.

Gamma bridge

Consider a univariate Gamma process, \(\Lambda(t)\) with \(\Lambda(0)=0.\) The Gamma bridge, analogous to the Brownian bridge, is the Gamma process conditional upon attaining a fixed value \(S=\Lambda(1)\) at terminal time \(1.\) We write \(\Lambda_{S}:=\{\Lambda(t)\mid \Lambda(1)=S\}_{0\lt t \lt 1}\) for the paths of this process.

We can simulate from the Gamma bridge easily. Since the increments of the process are independent, if we have a Gamma process on the index set \([0,1]\) such that \(\Lambda(1)=S\), then we can simulate from the bridge paths which connect these points at an intermediate time \(t,\, 0<t<1,\) by recalling that we have known distributions for the increments; in particular \(\Lambda(t)\sim\operatorname{Gamma}(\alpha t, \lambda)\) and \(\Lambda(1)-\Lambda(t)\sim\operatorname{Gamma}(\alpha (1-t), \lambda),\) and these increments, as increments over disjoint intervals, are themselves independent. Then, by the stick-breaking rule, \[\frac{\Lambda(t)}{\Lambda(1)}\sim\operatorname{Beta}(\alpha t, \alpha(1-t))\] independent of \(\Lambda(1).\) We can therefore draw from the bridge \(\Lambda_{S}(t)\) for some \(t\lt 1\) by simulating \(\Lambda_{S}(t)=B S,\) where \(B\sim \operatorname{Beta}(\alpha t,\alpha (1-t)).\)
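
That recipe is nearly a one-liner in numpy (function name mine). Note that the rate \(\lambda\) cancels out of the bridge entirely:

```python
import numpy as np

def gamma_bridge_point(t, S, alpha, rng):
    """Sample Lambda_S(t), the Gamma bridge pinned at Lambda(0)=0, Lambda(1)=S.

    By the stick-breaking rule, Lambda(t)/Lambda(1) ~ Beta(alpha*t, alpha*(1-t)),
    independent of Lambda(1) = S; the rate lambda drops out of the conditional law.
    """
    return S * rng.beta(alpha * t, alpha * (1 - t))

rng = np.random.default_rng(2)
print(gamma_bridge_point(t=0.3, S=5.0, alpha=2.0, rng=rng))
```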

Poisson process

The Poisson process is the stochastic process whose inter-occurrence times are identically and independently distributed such that \(t_i-t_{i-1}\sim\operatorname{Exp}(\lambda)\) (rate \(\lambda\), mean \(1/\lambda\)). By convention, we set all \(t_i\geq 0\) and \(N(0)=0\) a.s. The counting process, which increases by one at each event, is what we usually graph when sketching this process: \(N: \mathbb{R}_{\geq 0}\to\mathbb{Z}_{\geq 0}\) such that \(N(t)\equiv \sum_{i\geq 1}\mathbb{I}_{\{t_i<t\}}\). It is easy to show that \(N(t)\sim\operatorname{Pois}(\lambda t)\), where the pmf of a Poisson RV is \[f(k;\lambda)={\frac {\lambda^{k}}{k!}}e^{-\lambda}.\]
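
A minimal simulation sketch in numpy (names mine): draw exponential inter-arrival times and accumulate them.

```python
import numpy as np

def poisson_arrival_times(lam, horizon, rng):
    """Simulate the arrival times of a rate-lam Poisson process on [0, horizon]
    by accumulating Exp(rate=lam) inter-arrival times.
    (numpy's exponential takes the *mean* 1/lam as its argument.)"""
    times = []
    t = rng.exponential(1 / lam)
    while t < horizon:
        times.append(t)
        t += rng.exponential(1 / lam)
    return np.array(times)

rng = np.random.default_rng(3)
arrivals = poisson_arrival_times(lam=4.0, horizon=10.0, rng=rng)
print(len(arrivals), "events; expected about", 4.0 * 10.0)  # N(10) ~ Pois(40)
```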

It is a standard result that the increments of such a process over disjoint intervals are independent and Poisson distributed. For \(t_j \geq t_i\): \[N(t_j)-N(t_i) \sim \operatorname{Pois}\left((t_j-t_i)\lambda\right).\]

Note also the standard result that \[\lambda=\lim_{h\to 0} \frac{\bb E\left(N(t+h)-N(t)\right)}{h}.\]

We call \(\lambda\) the rate.

Poisson bridge

Suppose we are given the value of a Poisson process at time \(0\) and time \(1\) and are concerned with some \(t\in(0,1).\) We wish to know the conditional distribution \(P(N(t)=n\mid N(1)=S).\) Writing \(\tilde{N}(1-t):=N(1)-N(t)\) for the independent count over \((t,1],\) and using the distributions of the increments and the independence property, \[\begin{aligned} P[N(t)=n\mid N(1)=S] &=\frac{P[N(t)=n\cap N(1)=S]}{P[N(1)=S]}\\ &=\frac{P[N(t)=n\cap \tilde{N}(1-t)=S-n]}{P[N(1)=S]}\\ &=\frac{ \frac{(\lambda t)^{n}}{n!}e^{-\lambda t} \frac{(\lambda (1-t))^{S-n}}{(S-n)!}e^{-\lambda (1-t)} }{ \frac{\lambda^S }{S!}e^{-\lambda } }\\ &=\frac{(\lambda t)^{n}(\lambda (1-t))^{S-n}}{\lambda^{S}} \frac{e^{-\lambda t}e^{-\lambda (1-t)}}{e^{-\lambda}} \frac{S!}{(S-n)!\,n!}\\ &=t^{n} (1-t)^{S-n} \binom{S}{n}\\ &=\operatorname{Binom}(n;S, t). \end{aligned}\] So we can simulate a point of the Poisson bridge at some \(t<1\) by sampling a Binomial random variable.
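
In numpy this is a one-liner (function name mine); as with the Gamma bridge, the rate \(\lambda\) cancels out of the conditional law:

```python
import numpy as np

def poisson_bridge_point(t, S, rng):
    """Given N(0)=0 and N(1)=S, sample N(t) ~ Binomial(S, t) for t in (0, 1).
    The rate lambda cancels out of the conditional distribution."""
    return rng.binomial(S, t)

rng = np.random.default_rng(4)
print(poisson_bridge_point(t=0.25, S=20, rng=rng))  # around S*t = 5 on average
```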

Refs