In probabilistic programming, we construct a posterior distribution represented by a log density function. The posterior is often analyzed either by drawing samples from it using Markov chain Monte Carlo (MCMC) or by fitting a distribution to it with variational inference and then studying that variational approximation.
Probabilistic models often have computational problems or make poor assumptions, which can manifest as poor MCMC sampling performance. By analyzing the MCMC draws, one may be able to identify these problems and improve the model. However, problematic models are precisely the models that tend to be slow to sample with MCMC, so much time is lost waiting for the MCMC warm-up phase to end before those draws are available.
When you have computational problems, often there's a problem with your model.
Pathfinder is a variational method for approximating a posterior distribution that is often much faster than MCMC warm-up. It can be used to obtain initial draws for diagnosing problems with a model, to find a point from which to initialize MCMC sampling, or even to replace the warm-up phase of Hamiltonian Monte Carlo (HMC).
We present Pathfinder.jl, a Julia implementation of Pathfinder that can be used with any Julia probabilistic programming language. It integrates especially well with AdvancedHMC.jl, DynamicHMC.jl, and Turing.jl. By making extensive use of Julia interface packages, it also facilitates further research into variants of Pathfinder.
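As a sketch of the intended workflow, here is a minimal example on a toy log density. It assumes the package's `pathfinder` entry point with keywords like `dim` and `ndraws` as documented at the time of writing; consult the Pathfinder.jl docs for the current interface.

```julia
using Pathfinder

# Unnormalized log density of a 5-dimensional standard normal target.
logp(x) = -sum(abs2, x) / 2

# Run single-path Pathfinder. `dim` gives the dimension of the target;
# gradients are computed with automatic differentiation by default.
result = pathfinder(logp; dim=5, ndraws=100)

result.fit_distribution  # multivariate normal approximation to the posterior
result.draws             # dim × ndraws matrix of approximate posterior draws

# Any draw can serve as an initial point for MCMC sampling.
theta0 = result.draws[:, 1]
```

Because the draws come from the variational approximation rather than from warm-up iterations, they are typically available in a fraction of the time MCMC adaptation would take, which is what makes the diagnostic and initialization uses described above practical.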
References
- Lu Zhang, Bob Carpenter, Andrew Gelman, Aki Vehtari (2021). Pathfinder: Parallel quasi-Newton variational inference. arXiv:2108.03782 [stat.ML].
- Code: https://github.com/mlcolab/Pathfinder.jl