The Living Thing / Notebooks :

Gamma processes

Usefulness: 🔧 🔧
Novelty: 💡
Uncertainty: 🤪 🤪
Incompleteness: 🚧 🚧

\(\renewcommand{\var}{\operatorname{Var}} \renewcommand{\dd}{\mathrm{d}} \renewcommand{\pd}{\partial} \renewcommand{\bb}[1]{\mathbb{#1}} \renewcommand{\bf}[1]{\mathbf{#1}} \renewcommand{\vv}[1]{\boldsymbol{#1}} \renewcommand{\mm}[1]{\mathrm{#1}} \renewcommand{\cc}[1]{\mathcal{#1}} \renewcommand{\oo}[1]{\operatorname{#1}} \renewcommand{\gvn}{\mid} \renewcommand{\II}{\mathbb{I}}\)

Gamma processes are the classic model of subordinators, i.e. non-decreasing Lévy processes.

Tutorial introductions to Gamma processes can be found in (Applebaum 2009; Asmussen and Glynn 2007; Rubinstein and Kroese 2016; Kyprianou 2014). Existence proofs etc. are deferred to those sources. You could also see Wikipedia.

As a Lévy process

Wikipedia tells us that a Gamma process is a pure-jump process with jump intensity given by

\[\nu (x)=\alpha x^{-1}\exp(-\lambda x).\]

That is, the Poisson rate, with respect to “time” \(t\), of jumps whose size lies in the range \([x, x+dx)\) is \(\nu(x)\,dx.\) We think of this as an infinite superposition of Poisson processes driving jumps of different sizes, where the jumps are mostly tiny.

This is how we think about these processes in terms of Lévy process theory, at least; we tend to set them up differently for statistical inference, in terms of the distribution of the process and its increments.

The marginal distribution of an increment of duration \(t\) is given by the Gamma distribution, which we had better cover first.

Gamma distributions

Let us take a brief digression into the vanilla Gamma distribution, which induces the Gamma process.

The density \(g(x;\alpha, \lambda )\) of the univariate Gamma distribution is

\[ g(x; \alpha, \lambda)= \frac{ \lambda^{\alpha} }{ \Gamma (\alpha) } x^{\alpha-1}e^{-\lambda x}, \quad x\geq 0. \] This is once again the shape-rate parameterisation, with shape \(\alpha\) and rate \(\lambda\). We can think of the Gamma distribution as the distribution at time 1 of a Gamma process.

If \(G\sim \operatorname{Gamma}(\alpha, \lambda)\) then \(\bb E(G)=\alpha/\lambda\) and \(\var(G)=\alpha/\lambda^2.\)

We use various facts about the Gamma distribution which quantify its divisibility properties.

  1. If \(G_1\sim \operatorname{Gamma}(\alpha_1, \lambda),\,G_2\sim \operatorname{Gamma}(\alpha_2, \lambda),\) and \(G_1\perp G_2,\) then \(G_1+G_2\sim \operatorname{Gamma}(\alpha_1+\alpha_2, \lambda)\) (additive rule)
  2. If \(G\sim \operatorname{Gamma}(\alpha, \lambda)\) then \(cG\sim \operatorname{Gamma}(\alpha, \lambda/c)\) (multiplicative rule)
  3. If \(G_1\sim \operatorname{Gamma}(\alpha_1, \lambda)\) and \(G_2\sim \operatorname{Gamma}(\alpha_2, \lambda)\) with \(G_1\perp G_2,\) then \(\frac{G_1}{G_1+G_2}\sim \operatorname{Beta}(\alpha_1, \alpha_2),\) independent of \(G_1+G_2\) (stick-breaking rule)
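A quick Monte Carlo sanity check of the three rules, as a NumPy sketch (note NumPy parameterises the Gamma by shape and *scale* \(=1/\lambda\), not rate; the particular parameter values here are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
alpha1, alpha2, lam = 2.0, 3.0, 1.5

# NumPy uses the shape/scale parameterisation; scale = 1/rate.
g1 = rng.gamma(alpha1, 1 / lam, n)
g2 = rng.gamma(alpha2, 1 / lam, n)

# Additive rule: G1 + G2 ~ Gamma(alpha1 + alpha2, lam),
# so its mean should be (alpha1 + alpha2) / lam.
s = g1 + g2
assert abs(s.mean() - (alpha1 + alpha2) / lam) < 0.02

# Multiplicative rule: c * G1 ~ Gamma(alpha1, lam / c),
# so its variance should be alpha1 / (lam / c)**2.
c = 2.0
assert abs((c * g1).var() - alpha1 / (lam / c) ** 2) < 0.1

# Stick-breaking rule: G1 / (G1 + G2) ~ Beta(alpha1, alpha2),
# so its mean should be alpha1 / (alpha1 + alpha2).
b = g1 / s
assert abs(b.mean() - alpha1 / (alpha1 + alpha2)) < 0.01
```

The stick-breaking check only tests the marginal law of the ratio, not its independence from the sum, but it catches the most common parameterisation mistakes.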

The Gamma process

The univariate Gamma process \(\{\Lambda(t;\alpha,\lambda)\}\) is an independent-increment process, with time index \(t\) and parameters \(\alpha, \lambda.\)

The marginal density \(g(x;t,\alpha, \lambda )\) of the process at time \(t\) is that of a Gamma RV; specifically,

\[ g(x;t, \alpha, \lambda) =\frac{ \lambda^{\alpha t} } { \Gamma (\alpha t) } x^{\alpha t-1}e^{-\lambda x}, \quad x\geq 0. \] That is, \(\Lambda(t) \sim \operatorname{Gamma}(\alpha t, \lambda),\) so that the mean and variance of increments per unit time are \(\bb E(\Lambda(1))=\alpha/\lambda\) and \(\var(\Lambda(1))=\alpha/\lambda^2.\)

Note that if \(\alpha t=1,\) then \(\Lambda(t;\alpha ,\lambda )\sim \operatorname{Exp}(\lambda).\)

This leads to a method for simulating a path of a Gamma process at an increasing sequence of times \(\{t_1, t_2, \dots, t_L\}.\) Taking \(t_0=0\) and \(\Lambda(t_0)=0,\) the increments \(G_i:=\Lambda(t_{i+1})-\Lambda(t_{i})\) are independent \(\operatorname{Gamma}(\alpha(t_{i+1}-t_{i}), \lambda)\) variates. Presuming we may simulate from the Gamma distribution, it follows that

\[\Lambda(t_i)=\sum_{j \lt i}\left( \Lambda(t_{j+1})-\Lambda(t_{j})\right)=\sum_{j \lt i} G_j.\]
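The increment scheme above can be sketched in a few lines of NumPy (the function name is mine; again NumPy wants shape and scale \(=1/\lambda\)):

```python
import numpy as np

def gamma_process_path(times, alpha, lam, rng=None):
    """Simulate a Gamma(alpha, lam) process at the given increasing times.

    Uses the fact that the increment over [t_i, t_{i+1}) is an independent
    Gamma(alpha * (t_{i+1} - t_i), lam) variate; the path is their cumsum.
    """
    rng = np.random.default_rng() if rng is None else rng
    times = np.asarray(times, dtype=float)
    dt = np.diff(times, prepend=0.0)  # treat Lambda(0) = 0
    # NumPy parameterises by shape and scale = 1/rate.
    increments = rng.gamma(alpha * dt, 1.0 / lam)
    return np.cumsum(increments)

path = gamma_process_path(np.linspace(0.01, 1.0, 100), alpha=2.0, lam=3.0)
assert np.all(np.diff(path) >= 0)  # non-decreasing, as a subordinator must be
```

Because the increments are exchanged for their cumulative sum, the sampled path is exact at the grid points; no discretisation error is incurred, unlike Euler schemes for diffusions.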

A standard \(d\)-dimensional gamma process is the concatenation of \(d\) independent univariate Gamma processes.

Gamma bridge

Consider a univariate Gamma process \(\Lambda(t)\) with \(\Lambda(0)=0.\) The Gamma bridge, analogous to the Brownian bridge, is the Gamma process conditional upon attaining a fixed value \(S=\Lambda(1)\) at terminal time \(1.\) We write \(\Lambda_{S}:=\{\Lambda(t)\mid \Lambda(1)=S\}_{0\lt t \lt 1}\) for the paths of this process.

We can simulate from the Gamma bridge easily. Since the increments of the process are independent, we can simulate bridge paths connecting \(\Lambda(0)=0\) to \(\Lambda(1)=S\) at an intermediate time \(t,\ 0<t<1,\) by recalling that the increments have known distributions: \(\Lambda(t)\sim\operatorname{Gamma}(\alpha t, \lambda)\) and \(\Lambda(1)-\Lambda(t)\sim\operatorname{Gamma}(\alpha (1-t), \lambda),\) and these increments, as increments over disjoint intervals, are independent. Then, by the stick-breaking rule,

\[\frac{\Lambda(t)}{\Lambda(1)}\sim\operatorname{Beta}(\alpha t, \alpha(1-t)),\] independent of \(\Lambda(1).\) We can therefore sample from a path of the bridge \(\Lambda_{S}(t)\) for some \(t\lt 1\) by simulating \(\Lambda_{S}(t)=B S,\) where \(B\sim \operatorname{Beta}(\alpha t,\alpha (1-t)).\)
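Applying the stick-breaking rule recursively over the remaining interval gives a sampler for the bridge at any increasing grid of times; a NumPy sketch (the grid and parameter values are arbitrary choices of mine; note the rate \(\lambda\) cancels out of the bridge, so only \(\alpha\) and the endpoint \(S\) appear):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, S = 2.0, 5.0   # shape parameter; conditioned endpoint Lambda(1) = S

# Sample the bridge at intermediate times 0 < t < 1 by recursive
# stick-breaking: conditional on Lambda(t_prev) and Lambda(1) = S, the
# fraction of the remaining mass S - Lambda(t_prev) accrued by time t
# is Beta(alpha * (t - t_prev), alpha * (1 - t)) distributed.
ts = np.array([0.25, 0.5, 0.75])
bridge = []
t_prev, val_prev = 0.0, 0.0
for t in ts:
    b = rng.beta(alpha * (t - t_prev), alpha * (1.0 - t))
    val_prev = val_prev + b * (S - val_prev)
    t_prev = t
    bridge.append(val_prev)

assert all(0 <= x <= S for x in bridge)  # bridge stays between endpoints
```

Each step conditions only on the previous bridge value and the endpoint, which is licensed by the independent-increments property.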

Gamma processes with dependent components

I am not sure what general correlations are possible here, but one obvious construction is to choose a transform matrix \(M\) with non-negative entries. Then the components of the process \(\{M\Lambda(t)\}\) are non-negative combinations of independent Gamma processes, hence non-decreasing Lévy processes, but no longer independent; each marginal is a convolution of rescaled Gammas, and is itself Gamma only in special cases (e.g. when each row of \(M\) has a single non-zero entry). Is this the most general possible dependent Gamma-type process? What is the covariance structure of that process? 🚧
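A small numerical check of this mixing construction, as a NumPy sketch (the matrix \(M\) here is an arbitrary non-negative example of mine; for unit-time increments with common \(\alpha,\lambda,\) independence gives \(\operatorname{Cov}(\Lambda(1))=(\alpha/\lambda^2)I,\) so the mixed covariance should be \((\alpha/\lambda^2)MM^\top\)):

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, lam, n = 2.0, 3.0, 100_000

# Two independent unit-time Gamma increments per sample...
g = rng.gamma(alpha, 1 / lam, size=(n, 2))

# ...mixed through a non-negative matrix M to induce dependence.
M = np.array([[1.0, 0.5],
              [0.0, 1.0]])
x = g @ M.T

# The components are now positively correlated,
assert np.corrcoef(x.T)[0, 1] > 0.3
# and the empirical covariance matches (alpha / lam**2) * M @ M.T.
emp = np.cov(x.T)
theory = (alpha / lam**2) * M @ M.T
assert np.allclose(emp, theory, atol=0.05)
```

This only checks the second-moment structure at a single time; the full dependence structure of the mixed process is the open question flagged above.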

Refs

Applebaum, David. 2004. “Lévy Processes—from Probability to Finance and Quantum Groups.” Notices of the AMS 51 (11): 12.

———. 2009. Lévy Processes and Stochastic Calculus. 2nd ed. Cambridge Studies in Advanced Mathematics 116. Cambridge ; New York: Cambridge University Press.

Asmussen, Søren, and Peter W. Glynn. 2007. Stochastic Simulation: Algorithms and Analysis. 2007 edition. New York: Springer.

Avramidis, Athanassios N., Pierre L’Ecuyer, and Pierre-Alexandre Tremblay. 2003. “New Simulation Methodology for Finance: Efficient Simulation of Gamma and Variance-Gamma Processes.” In Proceedings of the 35th Conference on Winter Simulation: Driving Innovation, 319–26. WSC ’03. New Orleans, Louisiana: Winter Simulation Conference. http://www-perso.iro.umontreal.ca/~lecuyer/myftp/papers/wsc03vg.pdf.

Barndorff-Nielsen, Ole E., Makoto Maejima, and Ken-Iti Sato. 2006. “Some Classes of Multivariate Infinitely Divisible Distributions Admitting Stochastic Integral Representations.” Bernoulli 12 (1): 1–33. https://projecteuclid.org/euclid.bj/1141136646.

Barndorff-Nielsen, Ole E., Jan Pedersen, and Ken-Iti Sato. 2001. “Multivariate Subordination, Self-Decomposability and Stability.” Advances in Applied Probability 33 (1): 160–87. https://doi.org/10.1017/S0001867800010685.

Bertoin, Jean. 1996. Lévy Processes. Cambridge Tracts in Mathematics 121. Cambridge ; New York: Cambridge University Press.

———. 1999. “Subordinators: Examples and Applications.” In Lectures on Probability Theory and Statistics: Ecole d’Eté de Probailités de Saint-Flour XXVII - 1997, edited by Jean Bertoin, Fabio Martinelli, Yuval Peres, and Pierre Bernard, 1717:1–91. Lecture Notes in Mathematics. Berlin, Heidelberg: Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-540-48115-7_1.

———. 2000. Subordinators, Lévy Processes with No Negative Jumps, and Branching Processes. University of Aarhus. Centre for Mathematical Physics and Stochastics …. http://www.maphysto.dk/oldpages/events/LevyBranch2000/notes/bertoin.pdf.

Bhattacharya, Rabi N., and Edward C. Waymire. 2009. Stochastic Processes with Applications. Society for Industrial and Applied Mathematics. http://epubs.siam.org/doi/abs/10.1137/1.9780898718997.fm.

Bondesson, Lennart. 2012. Generalized Gamma Convolutions and Related Classes of Distributions and Densities. Springer Science & Business Media. http://books.google.com?id=sBDlBwAAQBAJ.

Buchmann, Boris, Benjamin Kaehler, Ross Maller, and Alexander Szimayer. 2015. “Multivariate Subordination Using Generalised Gamma Convolutions with Applications to V.G. Processes and Option Pricing,” February. http://arxiv.org/abs/1502.03901.

Connor, Robert J., and James E. Mosimann. 1969. “Concepts of Independence for Proportions with a Generalization of the Dirichlet Distribution.” Journal of the American Statistical Association 64 (325): 194–206.

Émery, Michel, and Marc Yor. 2004. “A Parallel Between Brownian Bridges and Gamma Bridges.” Publications of the Research Institute for Mathematical Sciences 40 (3): 669–88. https://doi.org/10.2977/prims/1145475488.

Figueroa-López, José E. 2012. “Jump-Diffusion Models Driven by Lévy Processes.” In Handbook of Computational Finance, edited by Jin-Chuan Duan, Wolfgang Karl Härdle, and James E. Gentle, 61–88. Berlin, Heidelberg: Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-17254-0_4.

Foti, Nicholas, Joseph Futoma, Daniel Rockmore, and Sinead Williamson. 2013. “A Unifying Representation for a Class of Dependent Random Measures.” In Artificial Intelligence and Statistics, 20–28. http://proceedings.mlr.press/v31/foti13a.html.

Gusak, Dmytro, Alexander Kukush, Alexey Kulik, Yuliya Mishura, and Andrey Pilipenko. 2010. Theory of Stochastic Processes : With Applications to Financial Mathematics and Risk Theory. Problem Books in Mathematics. New York: Springer New York. http://link.springer.com/book/10.1007%2F978-0-387-87862-1.

Hackmann, Daniel, and Alexey Kuznetsov. 2016. “Approximating Lévy Processes with Completely Monotone Jumps.” The Annals of Applied Probability 26 (1): 328–59. https://doi.org/10.1214/14-AAP1093.

Ishwaran, Hemant, and Mahmoud Zarepour. 2002. “Exact and Approximate Sum Representations for the Dirichlet Process.” Canadian Journal of Statistics 30 (2): 269–83. https://doi.org/10.2307/3315951.

Itkin, Andrey. 2016. “Efficient Solution of Backward Jump-Diffusion Partial Integro-Differential Equations with Splitting and Matrix Exponentials,” January. https://papers.ssrn.com/abstract=2794898.

James, Lancelot F., Bernard Roynette, and Marc Yor. 2008. “Generalized Gamma Convolutions, Dirichlet Means, Thorin Measures, with Explicit Examples.” Probability Surveys 5: 346–415. https://doi.org/10.1214/07-PS118.

Kyprianou, Andreas E. 2014. Fluctuations of Lévy Processes with Applications: Introductory Lectures. Second edition. Universitext. Heidelberg: Springer.

Lalley, Steven P. 2007. “Lévy Processes, Stable Processes, and Subordinators.”

Lawrence, Neil D., and Raquel Urtasun. 2009. “Non-Linear Matrix Factorization with Gaussian Processes.” In Proceedings of the 26th Annual International Conference on Machine Learning, 601–8. ICML ’09. New York, NY, USA: ACM. https://doi.org/10.1145/1553374.1553452.

Lefebvre, Mario. 2007. Applied Stochastic Processes. Universitext. Springer New York. http://link.springer.com/chapter/10.1007/978-0-387-48976-6_2.

Lo, Albert Y., and Chung-Sing Weng. 1989. “On a Class of Bayesian Nonparametric Estimates: II. Hazard Rate Estimates.” Annals of the Institute of Statistical Mathematics 41 (2): 227–45. https://doi.org/10.1007/BF00049393.

Mathai, A. M., and P. G. Moschopoulos. 1991. “On a Multivariate Gamma.” Journal of Multivariate Analysis 39 (1): 135–53. https://doi.org/10.1016/0047-259X(91)90010-Y.

Mathal, A. M., and P. G. Moschopoulos. 1992. “A Form of Multivariate Gamma Distribution.” Annals of the Institute of Statistical Mathematics 44 (1): 97–106. https://doi.org/10.1007/BF00048672.

Olofsson, Peter. 2005. Probability, Statistics, and Stochastic Processes. Hoboken, N.J: Hoboken, N.J. : Wiley-Interscience. http://dx.doi.org/10.1002%2F9780471743064.

Pérez-Abreu, Victor, and Robert Stelzer. 2014. “Infinitely Divisible Multivariate and Matrix Gamma Distributions.” Journal of Multivariate Analysis 130 (September): 155–75. https://doi.org/10.1016/j.jmva.2014.04.017.

Rasmussen, Carl Edward, and Hannes Nickisch. 2010. “Gaussian Processes for Machine Learning (GPML) Toolbox.” Journal of Machine Learning Research 11 (Nov): 3011–5. http://www.jmlr.org/papers/v11/rasmussen10a.html.

Rubinstein, Reuven Y., and Dirk P. Kroese. 2016. Simulation and the Monte Carlo Method. 3 edition. Wiley Series in Probability and Statistics. Hoboken, New Jersey: Wiley.

Sato, Ken-iti, Sato Ken-Iti, and A. Katok. 1999. Lévy Processes and Infinitely Divisible Distributions. Cambridge University Press.

Semeraro, Patrizia. 2008. “A Multivariate Variance Gamma Model for Financial Applications.” International Journal of Theoretical and Applied Finance 11 (01): 1–18. https://doi.org/10.1142/S0219024908004701.

Singpurwalla, Nozer D., and Mark A. Youngren. 1993. “Multivariate Distributions Induced by Dynamic Environments.” Scandinavian Journal of Statistics 20 (3): 251–61. https://www.jstor.org/stable/4616280.

Steutel, Fred W., and Klaas van Harn. 2003. Infinite Divisibility of Probability Distributions on the Real Line. CRC Press. http://books.google.com?id=5ddskbtvVjMC.

Tankov, Peter, and Ekaterina Voltchkova. n.d. “Jump-Diffusion Models: A Practitioner’s Guide,” 24.

Veillette, Mark, and Murad S. Taqqu. 2010a. “Using Differential Equations to Obtain Joint Moments of First-Passage Times of Increasing Lévy Processes.” Statistics & Probability Letters 80 (7): 697–705. https://doi.org/10.1016/j.spl.2010.01.002.

———. 2010b. “Numerical Computation of First-Passage Times of Increasing Lévy Processes.” Methodology and Computing in Applied Probability 12 (4): 695–729. https://doi.org/10.1007/s11009-009-9158-y.

Wilson, Andrew Gordon, and Zoubin Ghahramani. 2011. “Generalised Wishart Processes.” In Proceedings of the Twenty-Seventh Conference on Uncertainty in Artificial Intelligence, 736–44. UAI’11. Arlington, Virginia, United States: AUAI Press. http://dl.acm.org/citation.cfm?id=3020548.3020633.

Wolpert, Robert L. 2006. “Stationary Gamma Processes,” 13.