Gaussian processes (Fall 2023)
Topic and reading list by week:
- Bayesian and frequentist inference
  - No assigned reading
- Weight-space and function-space representations
  - Williams and Rasmussen (2006), Chapters 1–2
- Inference with conjugate GPs
  - Williams and Rasmussen (2006), Chapter 2
- Standard kernel functions
  - Williams and Rasmussen (2006), Sections 4.1–4.2
- Valid stationary kernels (Bochner’s theorem)
  - Williams and Rasmussen (2006), Section 4.2.1
  - Lindgren, Rootzén, and Sandsten (2012), Sections 3.1–3.2
- Vector-valued GPs
  - Alvarez et al. (2012)
- Laplace approximation and MCMC for non-conjugate likelihoods
  - Williams and Rasmussen (2006), Sections 3.1–3.4
- Scalable GPs with inducing points
  - Titsias (2009); Hensman, Fusi, and Lawrence (2013); Hensman, Matthews, and Ghahramani (2015)
- Mercer’s theorem
- Political science application: Guest speaker Avi Feller
  - Ben-Michael et al. (2023)
- Oceanography application: Guest speaker Renato Berlinghieri
  - Berlinghieri et al. (2023)
- Cosmology application: Guest speaker Zarija Lukic
  - Takhtaganov and Müller (2018)
Variational inference (Fall 2024)
Topic and reading list by week:
- Bayesian statistics: MCMC vs. the Laplace approximation
  - No assigned reading
- Overview of variational inference
  - Blei, Kucukelbir, and McAuliffe (2017)
- Exponential families
  - Wainwright and Jordan (2008)
- Exponential families, week 2
  - Wainwright and Jordan (2008)
- The EM algorithm and CAVI
  - Neal and Hinton (1998)
- Stochastic variational inference and natural gradients
  - Hoffman et al. (2013)
- Black-box variational inference and ADVI
  - Kucukelbir et al. (2017)
- Deterministic ADVI and linear response covariances
  - Giordano, Ingram, and Broderick (2024)
- Normalizing flows
  - Rezende and Mohamed (2015)
- Stein variational gradient descent
  - Liu and Wang (2016)
- Importance-weighted variational inference
  - Doucet, Moulines, and Thin (2023)
- Variational autoencoders
  - Kingma and Welling (2013)
- Classification-based inference
  - New work, no assigned reading!
Simulation-based inference (Summer 2024)
Topic and reading list by week:
- Introduction to simulation-based Bayesian inference
  - No assigned reading
- Approximate Bayesian computation (ABC)
  - Marin et al. (2012)
- Conditional density estimation
  - Izbicki and Lee (2017)
- Neural ratio estimation (NRE)
  - Cranmer, Pavez, and Louppe (2015)
- Neural posterior estimation (NPE)
  - Lueckmann et al. (2017)
- NPE continued (BayesFlow)
  - Radev et al. (2020)
- Surrogates
  - Price et al. (2018)
- Evaluation
  - Zhao et al. (2021)
- Bayesian SBI for machine learning: TabPFN
  - Müller et al. (2021)
- Review and reflections
  - Cranmer, Brehmer, and Louppe (2020)
References
Alvarez, Mauricio A, Lorenzo Rosasco, and Neil D Lawrence. 2012. “Kernels for Vector-Valued Functions: A Review.” Foundations and Trends in Machine Learning 4 (3): 195–266.
Ben-Michael, Eli, David Arbour, Avi Feller, Alexander Franks, and Steven Raphael. 2023. “Estimating the Effects of a California Gun Control Program with Multitask Gaussian Processes.” The Annals of Applied Statistics 17 (2): 985–1016.
Berlinghieri, Renato, Brian L Trippe, David R Burt, Ryan Giordano, Kaushik Srinivasan, Tamay Özgökmen, Junfei Xia, and Tamara Broderick. 2023. “Gaussian Processes at the Helm(Holtz): A More Fluid Model for Ocean Currents.” In Proceedings of the 40th International Conference on Machine Learning, 2113–63.
Blei, David M, Alp Kucukelbir, and Jon D McAuliffe. 2017. “Variational Inference: A Review for Statisticians.” Journal of the American Statistical Association 112 (518): 859–77.
Cranmer, Kyle, Johann Brehmer, and Gilles Louppe. 2020. “The Frontier of Simulation-Based Inference.” Proceedings of the National Academy of Sciences 117 (48): 30055–62.
Cranmer, Kyle, Juan Pavez, and Gilles Louppe. 2015. “Approximating Likelihood Ratios with Calibrated Discriminative Classifiers.” arXiv Preprint arXiv:1506.02169.
Doucet, Arnaud, Eric Moulines, and Achille Thin. 2023. “Differentiable Samplers for Deep Latent Variable Models.” Philosophical Transactions of the Royal Society A 381 (2247): 20220147.
Giordano, Ryan, Martin Ingram, and Tamara Broderick. 2024. “Black Box Variational Inference with a Deterministic Objective: Faster, More Accurate, and Even More Black Box.” Journal of Machine Learning Research 25 (18): 1–39.
Hensman, James, Nicolo Fusi, and Neil D Lawrence. 2013. “Gaussian Processes for Big Data.” arXiv Preprint arXiv:1309.6835.
Hensman, James, Alexander Matthews, and Zoubin Ghahramani. 2015. “Scalable Variational Gaussian Process Classification.” In Artificial Intelligence and Statistics, 351–60. PMLR.
Hoffman, Matthew, David Blei, Chong Wang, and John Paisley. 2013. “Stochastic Variational Inference.” Journal of Machine Learning Research 14 (1): 1303–47.
Izbicki, Rafael, and Ann B. Lee. 2017. “Converting High-Dimensional Regression to High-Dimensional Conditional Density Estimation.” Electronic Journal of Statistics 11 (2): 2800–2831.
Kingma, Diederik, and Max Welling. 2013. “Auto-Encoding Variational Bayes.” arXiv Preprint arXiv:1312.6114.
Kucukelbir, Alp, Dustin Tran, Rajesh Ranganath, Andrew Gelman, and David M Blei. 2017. “Automatic Differentiation Variational Inference.” Journal of Machine Learning Research 18 (14): 1–45.
Lindgren, Georg, Holger Rootzén, and Maria Sandsten. 2012. Stationary Stochastic Processes. Chapman & Hall/CRC.
Liu, Qiang, and Dilin Wang. 2016. “Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm.” Advances in Neural Information Processing Systems 29.
Lueckmann, Jan-Matthis, Pedro J Goncalves, Giacomo Bassetto, Kaan Öcal, Marcel Nonnenmacher, and Jakob H Macke. 2017. “Flexible Statistical Inference for Mechanistic Models of Neural Dynamics.” Advances in Neural Information Processing Systems 30.
Marin, Jean-Michel, Pierre Pudlo, Christian P Robert, and Robin J Ryder. 2012. “Approximate Bayesian Computational Methods.” Statistics and Computing 22 (6): 1167–80.
Müller, Samuel, Noah Hollmann, Sebastian Pineda Arango, Josif Grabocka, and Frank Hutter. 2021. “Transformers Can Do Bayesian Inference.” arXiv Preprint arXiv:2112.10510.
Neal, Radford M, and Geoffrey E Hinton. 1998. “A View of the EM Algorithm That Justifies Incremental, Sparse, and Other Variants.” In Learning in Graphical Models, 355–68. Springer.
Price, Leah F, Christopher C Drovandi, Anthony Lee, and David J Nott. 2018. “Bayesian Synthetic Likelihood.” Journal of Computational and Graphical Statistics 27 (1): 1–11.
Radev, Stefan T, Ulf K Mertens, Andreas Voss, Lynton Ardizzone, and Ullrich Köthe. 2020. “BayesFlow: Learning Complex Stochastic Models with Invertible Neural Networks.” IEEE Transactions on Neural Networks and Learning Systems 33 (4): 1452–66.
Rezende, Danilo, and Shakir Mohamed. 2015. “Variational Inference with Normalizing Flows.” In International Conference on Machine Learning, 1530–38. PMLR.
Takhtaganov, Timur, and Juliane Müller. 2018. “Adaptive Gaussian Process Surrogates for Bayesian Inference.” arXiv Preprint arXiv:1809.10784.
Titsias, Michalis. 2009. “Variational Learning of Inducing Variables in Sparse Gaussian Processes.” In Artificial Intelligence and Statistics, 567–74. PMLR.
Wainwright, Martin J. 2019. High-Dimensional Statistics: A Non-Asymptotic Viewpoint. Vol. 48. Cambridge University Press.
Wainwright, Martin J, and Michael I Jordan. 2008. “Graphical Models, Exponential Families, and Variational Inference.” Foundations and Trends in Machine Learning 1 (1–2): 1–305.
Williams, Christopher KI, and Carl Edward Rasmussen. 2006. Gaussian Processes for Machine Learning. Cambridge, MA: MIT Press.
Zhao, David, Niccolò Dalmasso, Rafael Izbicki, and Ann B Lee. 2021. “Diagnostics for Conditional Density Models and Bayesian Inference Algorithms.” In Uncertainty in Artificial Intelligence, 1830–40. PMLR.