Bayesian machine learning
ECTS: 4
Instructor in charge:
Teaching hours: 24
Course content description:
Bayesian Nonparametrics:
- Introduction
- The Dirichlet Process (a stick-breaking sketch follows this list)
- Infinite Mixture models
- Posterior Sampling
- Models beyond the Dirichlet Process
- Gaussian Processes
- Selected applications
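To make the nonparametrics part concrete, here is a minimal sketch (not taken from the course material) of the stick-breaking construction of a draw from a Dirichlet process DP(alpha, G0), truncated at K atoms, with a standard normal base measure. The concentration alpha = 2.0 and truncation level K = 100 are arbitrary illustrative choices; the same weights-and-atoms representation is what infinite mixture models and their posterior samplers build on.

import numpy as np

rng = np.random.default_rng(0)

def stick_breaking_dp(alpha, K, rng):
    """Truncated draw G ~ DP(alpha, G0), returned as (weights, atoms)."""
    # Stick-breaking proportions V_k ~ Beta(1, alpha)
    v = rng.beta(1.0, alpha, size=K)
    # Weights pi_k = V_k * prod_{j<k} (1 - V_j)
    weights = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    # Atoms drawn i.i.d. from the base measure G0 = N(0, 1)
    atoms = rng.normal(0.0, 1.0, size=K)
    return weights, atoms

weights, atoms = stick_breaking_dp(alpha=2.0, K=100, rng=rng)
# Observations from the truncated random measure concentrate on the atoms,
# which is the clustering behaviour exploited by infinite mixture models.
draws = rng.choice(atoms, size=10, p=weights / weights.sum())
print(draws)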
Bayesian Deep Learning:
- Why do we want parameter uncertainty?
- Priors for Bayesian neural networks
- Posterior inference (see the sketch after this list)
- Martingale Posteriors and generalised Bayesian Inference
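As an illustration of why parameter uncertainty matters and how the MCMC prerequisite is used, here is a minimal sketch (an assumed illustration, not the course's reference code) of posterior inference for a tiny one-hidden-layer Bayesian neural network: an isotropic Gaussian prior on the weights, a Gaussian likelihood, and a random-walk Metropolis sampler. The network size, noise scale, and step size are arbitrary choices for the example.

import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D regression data
x = np.linspace(-3, 3, 40)[:, None]
y = np.sin(x[:, 0]) + 0.1 * rng.normal(size=40)

H = 10                      # hidden units
dim = H + H + H + 1         # W1 (1xH), b1 (H), w2 (H), b2 (scalar)

def unpack(theta):
    W1 = theta[:H].reshape(1, H)
    b1 = theta[H:2 * H]
    w2 = theta[2 * H:3 * H]
    b2 = theta[-1]
    return W1, b1, w2, b2

def predict(theta, x):
    W1, b1, w2, b2 = unpack(theta)
    h = np.tanh(x @ W1 + b1)
    return h @ w2 + b2

def log_post(theta, sigma_prior=1.0, sigma_noise=0.1):
    # Log prior: independent N(0, sigma_prior^2) on every weight
    lp = -0.5 * np.sum(theta**2) / sigma_prior**2
    # Log likelihood: Gaussian observation noise
    resid = y - predict(theta, x)
    ll = -0.5 * np.sum(resid**2) / sigma_noise**2
    return lp + ll

# Random-walk Metropolis over the flattened weight vector
theta = rng.normal(scale=0.1, size=dim)
samples = []
for it in range(20000):
    prop = theta + 0.02 * rng.normal(size=dim)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    if it >= 10000 and it % 50 == 0:
        samples.append(theta.copy())

# The spread of the predictive draws reflects parameter uncertainty
preds = np.stack([predict(s, x) for s in samples])
print(preds.mean(axis=0)[:5], preds.std(axis=0)[:5])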
Required prerequisites:
- Bayesian statistics
- Markov Chain Monte Carlo
Skills to be acquired:
Essentials of Bayesian Nonparametrics and the main concepts of Bayesian Deep Learning
Assessment method:
Final exam and homework
Bibliography, recommended reading:
- Hjort NL, Holmes C, Müller P, Walker SG, editors. Bayesian nonparametrics. Cambridge University Press; 2010 Apr 12.
- Ghosal S, Van der Vaart AW. Fundamentals of nonparametric Bayesian inference. Cambridge University Press; 2017 Jun 26.
- Rasmussen CE, Williams CKI. Gaussian processes for machine learning. Cambridge, MA: MIT Press; 2006.
- Many references at www.gatsby.ucl.ac.uk/~porbanz/npb-tutorial.html
- Murphy KP. Probabilistic machine learning: Advanced topics. MIT Press; 2023 Aug 15.
- Fong E, Holmes C, Walker SG. Martingale posterior distributions. Journal of the Royal Statistical Society Series B: Statistical Methodology. 2023 Nov;85(5):1357-91.