Foundations of Machine Learning
Instructor:
- FRANCIS BACH
Course content description:
The course will introduce the theoretical foundations of machine learning, review the most successful algorithms together with their theoretical guarantees, and discuss their application to real-world problems. The topics covered are:
Part 1: Supervised Learning Theory: the batch setting
- Intro
- Surrogate Losses
- Uniform Convergence and PAC Learning
- Empirical Risk Minimization and ill-posed problems (see the sketch after this list)
- Concentration Inequalities
- Universal consistency, PAC Learnability
- VC dimension
- Rademacher complexity
- Non-Uniform Learning and Model Selection
- Bias-variance tradeoff
- Structural Risk Minimization Principle and Minimum Description Length Principle
- Regularization
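As an illustration of the surrogate-loss and empirical risk minimization topics above, here is a minimal sketch in Python: the 0-1 loss is replaced by the convex logistic surrogate, whose empirical risk is minimized by gradient descent. The synthetic data, step size, and iteration count are assumptions made for the example, not course material.

```python
# Minimal sketch of empirical risk minimization with a convex surrogate loss:
# binary classification where the 0-1 loss is replaced by the logistic loss,
# minimized by plain gradient descent on the training sample.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data with labels in {-1, +1} (illustrative assumption).
n, d = 200, 2
X = rng.normal(size=(n, d))
w_true = np.array([1.5, -2.0])
y = np.sign(X @ w_true + 0.3 * rng.normal(size=n))

def logistic_loss(w, X, y):
    # Empirical risk under the logistic surrogate: mean of log(1 + exp(-y <w, x>)).
    margins = y * (X @ w)
    return np.mean(np.logaddexp(0.0, -margins))

def logistic_grad(w, X, y):
    margins = y * (X @ w)
    coeff = -y / (1.0 + np.exp(margins))      # derivative of the surrogate w.r.t. the margin
    return X.T @ coeff / len(y)

w = np.zeros(d)
for _ in range(500):                          # fixed-step gradient descent on the empirical risk
    w -= 0.5 * logistic_grad(w, X, y)

zero_one_error = np.mean(np.sign(X @ w) != y)  # training 0-1 error of the ERM solution
print(f"surrogate risk: {logistic_loss(w, X, y):.3f}, 0-1 training error: {zero_one_error:.3f}")
```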
Part 2: Supervised Learning Theory and Algorithms in the Online Setting
- Foundations of Online Learning
- Beyond the Perceptron algorithm (the basic Perceptron is sketched after this list)
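As a starting point for the online setting, here is a minimal sketch of the classical Perceptron: examples arrive one at a time and the weight vector is updated only when a mistake is made. The synthetic stream and the separator w_star are assumptions for illustration.

```python
# Minimal sketch of the Perceptron in the online setting: update on mistakes only.
import numpy as np

rng = np.random.default_rng(1)
w_star = np.array([2.0, -1.0, 0.5])           # unknown separator generating the stream

w = np.zeros(3)
mistakes = 0
for t in range(1000):
    x_t = rng.normal(size=3)                  # nature reveals the next example
    y_t = np.sign(w_star @ x_t)               # its true label in {-1, +1}
    if y_t * (w @ x_t) <= 0:                  # mistake (or undecided): update
        w += y_t * x_t                        # Perceptron update rule
        mistakes += 1

print(f"mistakes over 1000 rounds: {mistakes}")
```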
Part 3: Ensemble Methods and Kernel Methods
- SVMs, Kernels
- Kernel approximation algorithms in the primal (see the sketch after this list)
- Ensemble methods: bagging, boosting, gradient boosting, random forests
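For the kernel-approximation-in-the-primal topic above, here is a minimal sketch using random Fourier features: explicit random features z(x) are built so that z(x)^T z(y) approximates the Gaussian kernel k(x, y), which lets a kernel method be trained as a linear model in the primal. The bandwidth, feature dimension, and toy data are assumptions for illustration.

```python
# Minimal sketch of approximating the Gaussian (RBF) kernel with random Fourier features.
import numpy as np

rng = np.random.default_rng(2)

def rbf_kernel(X, Y, sigma=1.0):
    # Exact Gaussian kernel matrix k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma ** 2))

def random_fourier_features(X, D=500, sigma=1.0):
    # Bochner's theorem: k(x, y) = E[z(x)^T z(y)] with z(x) = sqrt(2/D) cos(Wx + b),
    # W drawn from N(0, sigma^{-2} I) and b uniform on [0, 2*pi].
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, D))
    b = rng.uniform(0, 2 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

X = rng.normal(size=(50, 3))
K_exact = rbf_kernel(X, X)
Z = random_fourier_features(X)
K_approx = Z @ Z.T                             # primal features reproduce the kernel matrix
print("max entrywise error:", np.abs(K_exact - K_approx).max())
```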
Part 4: Algorithms for Unsupervised Learning
- Dimensionality reduction: PCA, ICA, Kernel PCA, ISOMAP, LLE (PCA is sketched after this list)
- Representation Learning
- Expectation Maximization, Latent models and Variational methods
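For the dimensionality-reduction topic above, here is a minimal sketch of PCA via eigendecomposition of the empirical covariance matrix, projecting centered data onto the top principal directions. The toy data and the choice of two components are assumptions for illustration.

```python
# Minimal sketch of PCA: eigendecomposition of the empirical covariance matrix.
import numpy as np

rng = np.random.default_rng(3)

# Toy data: 3-dimensional points that mostly vary along two directions.
n = 300
latent = rng.normal(size=(n, 2)) @ np.array([[2.0, 0.0, 1.0], [0.0, 1.0, -1.0]])
X = latent + 0.1 * rng.normal(size=(n, 3))

Xc = X - X.mean(axis=0)                        # center the data
cov = Xc.T @ Xc / (n - 1)                      # empirical covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)         # eigh returns eigenvalues in ascending order
components = eigvecs[:, ::-1][:, :2]           # top-2 principal directions
scores = Xc @ components                       # low-dimensional representation

explained = eigvals[::-1][:2].sum() / eigvals.sum()
print(f"variance explained by 2 components: {explained:.3f}")
```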
Recommended prerequisites:
- Linear models
Required prerequisites:
- Linear Algebra
- Statistics and Probability
Skills to be acquired:
The aim of this course is to provide the students with the fundamental concepts and tools for developing and analyzing machine learning algorithms.
Assessment:
- Each student will serve as scribe for one lecture, taking notes during the class and sending them to the teacher as a PDF.
- Final exam
Bibliography and recommended reading:
The most important book:
- Shalev-Shwartz, S., & Ben-David, S. (2014). Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press.
Also:
- Mohri, M., Rostamizadeh, A., & Talwalkar, A. (2012). Foundations of Machine Learning. MIT Press.
- Vapnik, V. (2013). The Nature of Statistical Learning Theory. Springer Science & Business Media.
- Bishop, C. (2006). Pattern Recognition and Machine Learning. Springer.
- Friedman, J., Hastie, T., & Tibshirani, R. (2001). The Elements of Statistical Learning. Springer Series in Statistics.
- James, G., Witten, D., Hastie, T., & Tibshirani, R. (2013). An Introduction to Statistical Learning. Springer.