Bayesian stats for beginners

*ahem* DJ, where’s the Bayes line?

Bayesian statistics is controversial amongst frequentists, sometimes for purely terminological reasons, and sometimes for profound philosophical ones.

I’m going to ignore that, because sometimes it is practical to use Bayesian statistics. Even for frequentists it is sometimes refreshing to move your effort from deriving frequentist estimators for intractable models, to just using the damn Bayesian ones, which fail in different and interesting ways than you are used to.

Anyway, you can avoid learning a lot of tedious frequentist machinery by starting with a prior belief that your model isn’t too pathological for a Bayesian MCMC sampler and proceeding accordingly. (You might, of course, need to prove some horrible Bayesian MCMC convergence results instead.) If it works and you are feeling fancy you might then justify your method on frequentist grounds, but then you are wiping out one interesting source of after-dinner argument.

Prior choice

Is weird and important. The rules of thumb on offer are argumentative and disputed.

Doing it

Stan

Stan is the go-to inference toolbox, especially for hierarchical models.

Getting started

R intro: Chris Fonnesbeck’s workshop in R

Intro to Stan for econometrics
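
For a flavour of the workflow, here is a minimal sketch via the PyStan interface (assuming PyStan 2.x; the model and data are invented purely for illustration):

```python
import numpy as np
import pystan  # PyStan 2.x interface to Stan

# Toy model: estimate a location and scale from noisy observations.
stan_code = """
data {
  int<lower=0> N;
  vector[N] y;
}
parameters {
  real mu;
  real<lower=0> sigma;
}
model {
  mu ~ normal(0, 10);     // weakly informative prior
  sigma ~ cauchy(0, 5);   // half-Cauchy via the lower bound
  y ~ normal(mu, sigma);
}
"""

y = np.random.normal(loc=1.0, scale=2.0, size=50)  # fake data
sm = pystan.StanModel(model_code=stan_code)         # compiles the model to C++
fit = sm.sampling(data={"N": len(y), "y": y}, iter=2000, chains=4)
print(fit)  # posterior summaries, R-hat, effective sample sizes
```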

Edward

Hot new option from Blei’s lab; it leverages trendy deep learning machinery, namely TensorFlow, for variational Bayes.
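
For a taste, here is a sketch of a Bayesian linear regression fit by variational inference, assuming Edward 1.x on TensorFlow 1.x, with data simulated purely for illustration:

```python
import numpy as np
import tensorflow as tf
import edward as ed
from edward.models import Normal

# Simulated regression data
N, D = 200, 5
X_train = np.random.randn(N, D).astype(np.float32)
w_true = np.random.randn(D).astype(np.float32)
y_train = X_train.dot(w_true) + 0.1 * np.random.randn(N).astype(np.float32)

# Model: Bayesian linear regression with standard normal priors
X = tf.placeholder(tf.float32, [N, D])
w = Normal(loc=tf.zeros(D), scale=tf.ones(D))
b = Normal(loc=tf.zeros(1), scale=tf.ones(1))
y = Normal(loc=ed.dot(X, w) + b, scale=tf.ones(N))

# Mean-field variational approximation
qw = Normal(loc=tf.Variable(tf.random_normal([D])),
            scale=tf.nn.softplus(tf.Variable(tf.random_normal([D]))))
qb = Normal(loc=tf.Variable(tf.random_normal([1])),
            scale=tf.nn.softplus(tf.Variable(tf.random_normal([1]))))

# Maximise the ELBO by stochastic gradient ascent
inference = ed.KLqp({w: qw, b: qb}, data={X: X_train, y: y_train})
inference.run(n_iter=500)
```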

Pyro

PyTorch + Bayes = Pyro, an Edward competitor.

From the Pyro launch announcement:

We believe the critical ideas to solve AI will come from a joint effort among a worldwide community of people pursuing diverse approaches. By open sourcing Pyro, we hope to encourage the scientific world to collaborate on making AI tools more flexible, open, and easy-to-use. We expect the current (alpha!) version of Pyro will be of most interest to probabilistic modelers who want to leverage large data sets and deep networks, PyTorch users who want easy-to-use Bayesian computation, and data scientists ready to explore the ragged edge of new technology.
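
Here is a minimal sketch of Pyro’s model/guide pattern, fitting a coin’s bias by stochastic variational inference; it assumes a reasonably recent Pyro and invents the data:

```python
import torch
import pyro
import pyro.distributions as dist
from torch.distributions import constraints
from pyro.infer import SVI, Trace_ELBO
from pyro.optim import Adam

def model(data):
    # Prior on the coin's bias
    f = pyro.sample("fairness", dist.Beta(10.0, 10.0))
    with pyro.plate("data", len(data)):
        pyro.sample("obs", dist.Bernoulli(f), obs=data)

def guide(data):
    # Variational posterior: another Beta with learnable parameters
    alpha_q = pyro.param("alpha_q", torch.tensor(15.0),
                         constraint=constraints.positive)
    beta_q = pyro.param("beta_q", torch.tensor(15.0),
                        constraint=constraints.positive)
    pyro.sample("fairness", dist.Beta(alpha_q, beta_q))

data = torch.cat([torch.ones(6), torch.zeros(4)])  # 6 heads, 4 tails
svi = SVI(model, guide, Adam({"lr": 0.01}), loss=Trace_ELBO())
for _ in range(2000):
    svi.step(data)

alpha_q, beta_q = pyro.param("alpha_q").item(), pyro.param("beta_q").item()
print("posterior mean bias:", alpha_q / (alpha_q + beta_q))
```

The guide is the variational family; you write it as ordinary PyTorch code, which is where the “deep universal probabilistic programming” pitch comes in.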

Turing.jl

Turing.jl is a Julia library for (universal) probabilistic programming.

PyMC3

PyMC3 is pure Python, which means you don’t need to wrangle C++ to fix things the way you do in Stan. It’s presumably slower than Stan if you actually run serious MCMC simulations, but I haven’t checked.

Chris Fonnesbeck’s example in Python
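
Here is a minimal sketch of the same estimate-a-mean toy as in the Stan snippet above, assuming PyMC3 3.x and simulated data:

```python
import numpy as np
import pymc3 as pm

# Fake observations of a single noisy quantity
y = np.random.normal(loc=1.0, scale=2.0, size=100)

with pm.Model() as model:
    mu = pm.Normal("mu", mu=0.0, sd=10.0)         # weakly informative prior
    sigma = pm.HalfCauchy("sigma", beta=5.0)      # prior on the noise scale
    obs = pm.Normal("obs", mu=mu, sd=sigma, observed=y)
    trace = pm.sample(1000, tune=1000, chains=2)  # NUTS under the hood

print(pm.summary(trace))
```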

Church/Anglican

Level up your esoterism with Church, a general-purpose, Turing-complete Monte Carlo Lisp derivative. It is slow as a thousand miles of baby-arse, but reputedly does some cute tricks with modeling human problem-solving and other likelihood-free methods, according to creators Noah Goodman and Joshua Tenenbaum.

See also Anglican, which is the same but different, being built in Clojure and hence also able to leverage ClojureScript in the browser.

WebPPL

WebPPL is a successor to Church, designed as a teaching language for probabilistic reasoning in the browser. Hip, if you like JavaScript ML.