Gradients and message-passing

Cleaving reality at the joint, then summing it at the marginal

November 25, 2014 — January 12, 2023

algebra
approximation
Bayes
distributed
dynamical systems
generative
graphical models
machine learning
networks
optimization
probability
signal processing
state space models
statistics
stochastic processes
swarm
time series
Figure 1: Bayes-by-backprop meets variational message-passing meets the chain rule.

1 Automatic differentiation as message-passing

This is a well-known bit of informal lore in the field, but apparently not well documented?

The first reference I can find is Eaton (2022), which is surprisingly late.

TBC
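In the meantime, here is a minimal sketch of the correspondence (my own toy construction, not Eaton's): reverse-mode automatic differentiation written so that each node of the computation graph accumulates "adjoint messages" from its consumers and then forwards locally weighted messages to its parents. The bookkeeping is structurally the same as sum-product message passing: sum the incoming messages at a node, multiply by a local factor, pass onward.

```python
# A toy reverse-mode autodiff engine phrased as message passing on the
# computation graph. Each node sums the adjoint messages arriving from its
# children, then sends a locally scaled message to each parent.
import math

class Node:
    def __init__(self, value, parents=(), local_grads=()):
        self.value = value              # forward value
        self.parents = parents          # parent nodes in the computation graph
        self.local_grads = local_grads  # d(self)/d(parent) for each parent
        self.message = 0.0              # accumulated adjoint message

def mul(a, b):
    return Node(a.value * b.value, (a, b), (b.value, a.value))

def add(a, b):
    return Node(a.value + b.value, (a, b), (1.0, 1.0))

def sin(a):
    return Node(math.sin(a.value), (a,), (math.cos(a.value),))

def backward(output):
    """Send adjoint messages from the output back through the graph."""
    output.message = 1.0                # d(output)/d(output)
    # Reverse topological order via depth-first post-order traversal.
    order, seen = [], set()
    def visit(node):
        if id(node) not in seen:
            seen.add(id(node))
            for p in node.parents:
                visit(p)
            order.append(node)
    visit(output)
    for node in reversed(order):
        for parent, g in zip(node.parents, node.local_grads):
            # Message from child to parent: one chain-rule edge at a time.
            parent.message += g * node.message

x, y = Node(2.0), Node(3.0)
f = add(mul(x, y), sin(x))              # f(x, y) = x*y + sin(x)
backward(f)
print(x.message, y.message)             # df/dx = y + cos(x), df/dy = x
```

Running this recovers df/dx = y + cos(x) and df/dy = x; the point is only that the backward pass is literally "sum the incoming messages at each node, then pass them along the edges".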

2 Stochastic variational message passing

Akbayrak (2023):

Stochastic approximation methods for variational inference have recently gained popularity in the probabilistic programming community since these methods are amenable to automation and allow online, scalable, and universal approximate Bayesian inference. Unfortunately, common Probabilistic Programming Libraries (PPLs) with stochastic approximation engines lack the efficiency of message passing-based inference algorithms with deterministic update rules such as Belief Propagation (BP) and Variational Message Passing (VMP). Still, Stochastic Variational Inference (SVI) and Conjugate-Computation Variational Inference (CVI) provide principled methods to integrate fast deterministic inference techniques with broadly applicable stochastic approximate inference. Unfortunately, implementation of SVI and CVI necessitates manually driven variational update rules, which do not yet exist in most PPLs. In this chapter, for the exponential family of distributions, we cast SVI and CVI explicitly in a message passing-based inference context. We also demonstrate how to go beyond exponential family of distributions by using raw stochastic gradient descent for the minimization of the free energy. We provide an implementation for SVI and CVI in ForneyLab, which is an automated message passing-based probabilistic programming package in the open source Julia language. Through a number of experiments, we demonstrate how SVI and CVI extends the automated inference capabilities of message passing-based probabilistic programming.
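To make the "raw stochastic gradient descent on the free energy" option concrete, here is a minimal NumPy sketch (not the ForneyLab/Julia implementation the chapter describes): a single non-conjugate node, a Gaussian prior on a logit with Bernoulli observations, fitted by a reparameterization-gradient estimate of the free-energy gradient. The model, variable names, and step sizes are toy choices of mine, not taken from Akbayrak (2023).

```python
# Stochastic gradient descent on the variational free energy with the
# reparameterization trick, for a non-conjugate Gaussian-prior / Bernoulli-
# likelihood node. q(z) = N(m, sigma^2); z = m + sigma * eps gives pathwise
# gradients of the expected log-joint.
import numpy as np

rng = np.random.default_rng(0)
x = rng.binomial(1, 0.8, size=50)         # synthetic Bernoulli observations

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def score(z):
    """d/dz log p(z, x): N(0, 1) prior plus Bernoulli log-likelihoods."""
    return -z + np.sum(x - sigmoid(z))

m, log_sigma = 0.0, 0.0
lr, n_mc = 0.05, 16                        # step size, Monte Carlo samples
for step in range(2000):
    eps = rng.standard_normal(n_mc)
    sigma = np.exp(log_sigma)
    z = m + sigma * eps                    # reparameterized samples from q
    s = np.array([score(zi) for zi in z])
    grad_m = -np.mean(s)                   # dF/dm
    grad_ls = -1.0 - np.mean(s * sigma * eps)   # dF/d(log sigma)
    m -= lr * grad_m
    log_sigma -= lr * grad_ls

print(f"q(z) approx N({m:.2f}, {np.exp(log_sigma)**2:.3f})")
```

An SVI/CVI-style scheme would replace the last two update lines with a natural-gradient step in the natural parameters of q wherever the node is exponential-family; the stochastic gradient estimate above is the same raw ingredient.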

3 References

Akbayrak. 2023. “Towards Universal Probabilistic Programming with Message Passing on Factor Graphs.”
Akbayrak, Şenöz, Sarı, et al. 2022. “Probabilistic Programming with Stochastic Variational Message Passing.” International Journal of Approximate Reasoning.
Akbayrak, and Vries. 2019. “Reparameterization Gradient Message Passing.” In 2019 27th European Signal Processing Conference (EUSIPCO).
Dauwels. 2007. “On Variational Message Passing on Factor Graphs.” In 2007 IEEE International Symposium on Information Theory.
Dehaene. 2016. “Expectation Propagation Performs a Smoothed Gradient Descent.” arXiv:1612.05053 [Stat].
Eaton. 2022. “Belief Propagation Generalizes Backpropagation.”
Huang, and Jojic. 2010. “Maximum-Likelihood Learning of Cumulative Distribution Functions on Graphs.” In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics.
Liao, Liu, Wang, et al. 2019. “Differentiable Programming Tensor Networks.” Physical Review X.
Lucibello, Pittorino, Perugini, et al. 2022. “Deep Learning via Message Passing Algorithms Based on Belief Propagation.” Machine Learning: Science and Technology.