
Markov Chain Monte Carlo methods

The family of samplers we usually want ergodicity results for: construct a Markov chain whose stationary distribution is the target, simulate it, and average along the trajectory.
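A minimal random-walk Metropolis-Hastings sketch, as a reminder of the basic machinery. The standard-normal target, step size and chain length are illustrative assumptions, not anything taken from the references below.

```python
import numpy as np


def log_target(x):
    """Unnormalised log-density of the (assumed) standard-normal target."""
    return -0.5 * x ** 2


def random_walk_metropolis(n_steps, step_size=1.0, x0=0.0, seed=None):
    """Run a random-walk Metropolis chain and return its trajectory."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_steps)
    for t in range(n_steps):
        proposal = x + step_size * rng.standard_normal()
        # Accept with probability min(1, pi(proposal) / pi(x)).
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[t] = x
    return samples


samples = random_walk_metropolis(10_000, seed=1)
print(samples.mean(), samples.var())  # roughly 0 and 1 if the chain mixes
```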

Pierre E. Jacob, John O'Leary and Yves F. Atchadé construct MCMC estimators without finite-time bias by coupling two chains so that they meet exactly after a random number of steps, which is convenient for parallelisation (JaOA17).
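A hedged sketch of the coupled-chains debiasing idea in JaOA17, under simplifying assumptions: two random-walk Metropolis chains, one lagging a step behind, draw proposals from a maximal coupling and share a common acceptance uniform, so they eventually meet and stay together; a telescoping correction then cancels the finite-time bias. The standard-normal target, step size, burn-in k and test function are illustrative choices, not the paper's examples.

```python
import numpy as np

rng = np.random.default_rng(2)
STEP = 1.0  # random-walk proposal standard deviation (assumed)


def log_target(x):
    return -0.5 * x ** 2  # assumed standard-normal target, up to a constant


def log_proposal(mean, x):
    return -0.5 * ((x - mean) / STEP) ** 2  # proposal log-density up to a constant


def maximal_coupling(mu_x, mu_y):
    """Draw (x', y') from a maximal coupling of N(mu_x, STEP^2) and N(mu_y, STEP^2)."""
    xp = mu_x + STEP * rng.standard_normal()
    if np.log(rng.uniform()) + log_proposal(mu_x, xp) <= log_proposal(mu_y, xp):
        return xp, xp  # both chains propose the same point
    while True:
        yp = mu_y + STEP * rng.standard_normal()
        if np.log(rng.uniform()) + log_proposal(mu_y, yp) > log_proposal(mu_x, yp):
            return xp, yp


def mh_step(x):
    """One ordinary random-walk Metropolis step."""
    xp = x + STEP * rng.standard_normal()
    return xp if np.log(rng.uniform()) < log_target(xp) - log_target(x) else x


def coupled_mh_step(x, y):
    """One coupled step: shared proposal coupling and a common acceptance uniform."""
    xp, yp = maximal_coupling(x, y)
    log_u = np.log(rng.uniform())  # once the chains meet, they move identically
    x_new = xp if log_u < log_target(xp) - log_target(x) else x
    y_new = yp if log_u < log_target(yp) - log_target(y) else y
    return x_new, y_new


def unbiased_estimate(h, k=5, x0=0.0, y0=0.0, max_iter=10_000):
    """H_k = h(X_k) + sum over t > k of [h(X_t) - h(Y_{t-1})], until the chains meet."""
    x, y = mh_step(x0), y0  # X runs one step ahead of Y
    t = 1
    estimate = h(x) if k == 1 else 0.0
    correction = 0.0
    while (t < k or x != y) and t < max_iter:
        x, y = coupled_mh_step(x, y)
        t += 1
        if t == k:
            estimate = h(x)
        elif t > k:
            correction += h(x) - h(y)  # zero once the chains have met
    return estimate + correction


# Averaging many independent replicates gives an unbiased estimate of E[h(X)];
# here h is the identity, so the replicates should average to roughly 0.
replicates = [unbiased_estimate(h=lambda z: z, k=5) for _ in range(2000)]
print(np.mean(replicates))
```

Because each replicate terminates at a random meeting time and needs no burn-in diagnostics, the replicates can be farmed out across processors and simply averaged, which is the parallelisation payoff mentioned above.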

Refs

Cald14
Calderhead, B. (2014) A general construction for parallelizing Metropolis-Hastings algorithms. Proceedings of the National Academy of Sciences, 111(49), 17408–17413. DOI.
JaOA17
Jacob, P. E., O'Leary, J., & Atchadé, Y. F. (2017) Unbiased Markov chain Monte Carlo with couplings. arXiv:1708.03625 [stat].
Liu96
Liu, J. S. (1996) Metropolized independent sampling with comparisons to rejection sampling and importance sampling. Statistics and Computing, 6(2), 113–119. DOI.
Neal93
Neal, R. M. (1993) Probabilistic inference using Markov chain Monte Carlo methods (Technical Report CRG-TR-93-1). Toronto, Canada: Department of Computer Science, University of Toronto.
NoFo16
Norton, R. A., & Fox, C. (2016) Tuning of MCMC with Langevin, Hamiltonian, and other stochastic autoregressive proposals. arXiv:1610.00781 [math, stat].
RuKr04
Rubinstein, R. Y., & Kroese, D. P. (2004) The Cross-Entropy Method: A Unified Approach to Combinatorial Optimization, Monte-Carlo Simulation and Machine Learning. New York, NY: Springer.
RuKr08
Rubinstein, R. Y., & Kroese, D. P. (2008) Simulation and the Monte Carlo Method (2nd ed.). Hoboken, NJ: John Wiley & Sons.
RuRV14
Rubinstein, R. Y., Ridder, A., & Vaisman, R. (2014) Fast Sequential Monte Carlo Methods for Counting and Optimization. Hoboken, NJ: Wiley.