The Living Thing / Notebooks

Bayesian sparsity

What if you like the flavours of both Bayesian inference and the implicit model selection of sparse inference? Can you cook Bayesian-Frequentist fusion cuisine with this novelty ingredient?

Laplace Prior

A Laplace prior on linear regression coefficients yields the classic LASSO as its MAP estimate. Pro: it is very easy to derive the frequentist LASSO as a MAP estimate from this prior. Con: not actually sparse for non-MAP uses; posterior means and posterior samples contain no exact zeros. I have no need for this right now, but I did enjoy Dan Simpson’s critique.
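To make the pro concrete: with Gaussian noise, the negative log posterior under a Laplace(0, b) prior is, up to constants, the LASSO objective with penalty λ = 1/b, so any LASSO solver computes the MAP. A minimal numpy sketch (the data, prior scale b, and the ISTA solver are my own illustrative choices, not from any of the posts cited here):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.0, 0.0, 0.0, 0.5])
sigma = 1.0
y = X @ beta_true + sigma * rng.normal(size=n)

b = 0.1  # Laplace prior scale (invented for illustration)

# Negative log posterior, up to constants:
#   ||y - X beta||^2 / (2 sigma^2) + ||beta||_1 / b
# i.e. exactly the LASSO objective with penalty lam = 1 / b.
# Find the MAP by proximal gradient (ISTA): a gradient step on the
# Gaussian log-likelihood, then soft-thresholding from the Laplace prior.
lam = 1.0 / b
step = sigma**2 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant
beta = np.zeros(p)
for _ in range(20_000):
    beta = beta + step * X.T @ (y - X @ beta) / sigma**2
    beta = np.sign(beta) * np.maximum(np.abs(beta) - step * lam, 0.0)

# The MAP satisfies the LASSO (subgradient) optimality conditions:
# |X_j'(y - X beta)| / sigma^2 <= lam, with equality on the active set.
corr = X.T @ (y - X @ beta) / sigma**2
print("MAP estimate:", np.round(beta, 3))
print("max |correlation| / lam:", np.max(np.abs(corr)) / lam)
```

Note the MAP itself can contain exact zeros (soft-thresholding), which is precisely what the rest of the posterior does not do.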

Spike-and-slab prior

TBD.
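As a placeholder, the generative story: each coefficient is exactly zero with probability 1 − π (the spike) and is otherwise drawn from a wide distribution (the slab), so draws from the prior, and from the posterior, are exactly sparse. A sampling sketch (π and the slab scale are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
pi = 0.2        # prior inclusion probability (arbitrary)
slab_sd = 3.0   # slab standard deviation (arbitrary)

# Spike-and-slab: z_i ~ Bernoulli(pi); beta_i = 0 if z_i == 0,
# else beta_i ~ N(0, slab_sd^2). Draws are *exactly* sparse.
z = rng.random(n) < pi
beta = np.where(z, rng.normal(0.0, slab_sd, n), 0.0)

print("fraction exactly zero:", np.mean(beta == 0.0))
```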

Horseshoe prior

Stan developer Michael Betancourt introduces some issues with LASSO-type inference for Bayesians, with a slant towards horseshoe-type priors in preference to spike-and-slab, possibly because discrete mixtures like spike-and-slab are not that great in Stan (the discrete indicators must be marginalized out by hand), albeit possible.
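For intuition on why horseshoe fans prefer it over the Laplace: the horseshoe, with its half-Cauchy local scales, piles more prior mass both very near zero (aggressive shrinkage of noise) and far out in the tails (strong signals escape shrinkage). A quick Monte Carlo sketch (unit global scale τ and the comparison thresholds are my own choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
tau = 1.0  # global shrinkage scale, held fixed for illustration

# Horseshoe: beta_i ~ N(0, (tau * lambda_i)^2), lambda_i ~ HalfCauchy(0, 1)
lam = np.abs(rng.standard_cauchy(n))
beta_hs = rng.normal(0.0, tau * lam)

# Laplace draws at the same scale, for comparison
beta_lap = rng.laplace(0.0, tau, n)

# The horseshoe puts more mass very near zero AND far out in the tails
for name, draws in [("horseshoe", beta_hs), ("laplace", beta_lap)]:
    near_zero = np.mean(np.abs(draws) < 0.1)
    far_tail = np.mean(np.abs(draws) > 10)
    print(f"{name}: P(|b|<0.1)={near_zero:.3f}, P(|b|>10)={far_tail:.4f}")
```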

Refs