Bayesian statistics is controversial amongst frequentists, sometimes for purely terminological reasons, and sometimes for profound philosophical ones.
I’m going to ignore that, because sometimes Bayesian statistics is simply practical. Even for frequentists it is sometimes refreshing to shift your effort from deriving frequentist estimators for intractable models to just using the damn Bayesian ones, which might just work.
Anyway, you can avoid learning a lot of tedious frequentist machinery by starting with a prior belief that your model isn’t too pathological for a Bayesian MCMC sampler and proceeding accordingly. (You might, of course, need to prove some horrible Bayesian MCMC convergence results instead.) If it works and you are feeling fancy, you might then justify your method on frequentist grounds, but then you are wiping out one interesting source of after-dinner argument.
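To make the "just use the damn sampler" point concrete, here is a minimal random-walk Metropolis sketch in plain NumPy. The target density, step size, and chain length are illustrative assumptions, not anything a particular library prescribes:

```python
import numpy as np

# Target known only up to a normalising constant -- here an (assumed)
# standard normal, standing in for an intractable posterior.
def log_target(theta):
    return -0.5 * theta ** 2

def metropolis(log_target, n_steps=20000, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    theta = 0.0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        proposal = theta + step * rng.normal()
        # Accept with probability min(1, target ratio), computed in log space
        if np.log(rng.uniform()) < log_target(proposal) - log_target(theta):
            theta = proposal
        samples[i] = theta
    return samples

samples = metropolis(log_target)
```

All the toolkits below are, at heart, industrial-strength elaborations of this loop: better proposals, gradients, adaptation, and diagnostics.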
Stan is the workhorse inference toolbox, especially for hierarchical models.
The Pyro developers pitch it like this:

> We believe the critical ideas to solve AI will come from a joint effort among a worldwide community of people pursuing diverse approaches. By open sourcing Pyro, we hope to encourage the scientific world to collaborate on making AI tools more flexible, open, and easy-to-use. We expect the current (alpha!) version of Pyro will be of most interest to probabilistic modelers who want to leverage large data sets and deep networks, PyTorch users who want easy-to-use Bayesian computation, and data scientists ready to explore the ragged edge of new technology.
Turing.jl is a Julia library for (universal) probabilistic programming. Current features include:
- Universal probabilistic programming with an intuitive modelling interface
- Hamiltonian Monte Carlo (HMC) sampling for differentiable posterior distributions
- Particle MCMC sampling for complex posterior distributions involving discrete variables and stochastic control flows
- Gibbs sampling that combines particle MCMC and HMC
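The HMC item above is the same idea across Turing, Stan, and PyMC3: follow the gradient of the log posterior via simulated Hamiltonian dynamics. A bare-bones sketch (the target, step size, and trajectory length here are my illustrative assumptions, not Turing's API):

```python
import numpy as np

# Toy differentiable target: a standard normal, up to a constant.
def log_p(q):
    return -0.5 * q ** 2

def grad_log_p(q):
    return -q

def hmc(n_steps=5000, eps=0.1, n_leapfrog=20, seed=0):
    rng = np.random.default_rng(seed)
    q = 0.0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        p = rng.normal()            # resample momentum each iteration
        q_new, p_new = q, p
        # Leapfrog integration of the Hamiltonian dynamics
        p_new += 0.5 * eps * grad_log_p(q_new)
        for _ in range(n_leapfrog - 1):
            q_new += eps * p_new
            p_new += eps * grad_log_p(q_new)
        q_new += eps * p_new
        p_new += 0.5 * eps * grad_log_p(q_new)
        # Metropolis correction on the Hamiltonian (potential + kinetic)
        h_old = -log_p(q) + 0.5 * p ** 2
        h_new = -log_p(q_new) + 0.5 * p_new ** 2
        if np.log(rng.uniform()) < h_old - h_new:
            q = q_new
        samples[i] = q
    return samples

samples = hmc()
```

The gradient requirement is why HMC only covers the differentiable part of a posterior; that is the gap the particle MCMC and Gibbs combinations above are filling.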
PyMC3 is pure Python, which means you don’t need to drop into C++ to fix things as you do in Stan. It’s presumably slower than Stan if you actually run serious MC simulations, but I haven’t checked.
Level up your esotericism with Church, a general-purpose Turing-complete Monte Carlo Lisp derivative, which is slow as a thousand miles of baby-arse but does some reputedly cute tricks with modelling human problem-solving, according to creators Noah Goodman and Joshua Tenenbaum.
See also Anglican, which is the same but different.
PAC-Bayes: what is this? Something by MacKay, Langford, Shawe-Taylor and Seeger, connecting Bayesian learning to the frequentist PAC-learning paradigm.
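For orientation, one common statement of a PAC-Bayes bound (roughly McAllester's form; the papers by the authors named above vary in the exact constants): fix a prior $P$ over hypotheses before seeing data; then with probability at least $1-\delta$ over an i.i.d. sample of size $n$, simultaneously for every posterior $Q$,

```latex
R(Q) \;\le\; \hat{R}(Q) + \sqrt{\frac{\operatorname{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{n}}{\delta}}{2n}}
```

where $R(Q)$ is the true risk and $\hat{R}(Q)$ the empirical risk of the $Q$-averaged predictor. The frequentist guarantee holds for any posterior, but it is tightest when $Q$ concentrates where $P$ already had mass, which is the Bayesian moral of the story.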