COURSE AIMS AND OBJECTIVES:
Students will become familiar with:
- the problems of supervised and unsupervised learning;
- basic machine learning algorithms and how to fit them using numerical optimization;
- the theoretical background of these algorithms using the framework of statistical learning theory.
COURSE DESCRIPTION AND SYLLABUS:
Topics covered may include:
1. Notions: bias-variance trade-off, generalization error, model selection, variable selection, cross-validation, bootstrap, regularization, optimization for machine learning.
2. Supervised learning methods: linear models, penalized linear models, basic classification methods such as logistic regression, LDA/QDA and naive Bayes, local methods, SVM, neural networks.
3. Unsupervised learning methods: principal component analysis, clustering methods.
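The notions in topic 1 fit together in a standard workflow: regularization controls the bias-variance trade-off, and cross-validation estimates generalization error to guide model selection. A minimal pure-Python sketch of this loop, selecting the ridge penalty for a one-dimensional penalized linear model (the toy data and penalty grid are illustrative assumptions, not course material):

```python
def ridge_fit(xs, ys, lam):
    """Closed-form ridge solution for y ~ w*x: w = <x, y> / (<x, x> + lam)."""
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

def cv_error(xs, ys, lam, k=5):
    """Average squared validation error over k contiguous folds."""
    n = len(xs)
    fold = n // k
    total = 0.0
    for i in range(k):
        lo, hi = i * fold, (i + 1) * fold
        # Train on everything outside the i-th fold, validate on the fold.
        w = ridge_fit(xs[:lo] + xs[hi:], ys[:lo] + ys[hi:], lam)
        total += sum((y - w * x) ** 2 for x, y in zip(xs[lo:hi], ys[lo:hi]))
    return total / n

# Toy data: noisy observations of y = 2x (illustrative only).
xs = [0.1 * i for i in range(50)]
ys = [2.0 * x + ((-1) ** i) * 0.1 for i, x in enumerate(xs)]

# Model selection: pick the penalty with the lowest cross-validated error.
best_lam = min([0.0, 0.1, 1.0, 10.0], key=lambda lam: cv_error(xs, ys, lam))
```

A heavy penalty shrinks the slope well below the true value, so its cross-validated error is dominated by squared bias; the selected penalty is one of the smaller values on the grid.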
RECOMMENDED READING:
- The Elements of Statistical Learning: Data Mining, Inference and Prediction, T. Hastie, R. Tibshirani, and J. Friedman, Springer, 2009.
- Deep Learning, I. Goodfellow, Y. Bengio, and A. Courville, MIT Press, 2016.
- Learning Theory from First Principles, F. Bach, MIT Press, 2024.
- All of Statistics: A Concise Course in Statistical Inference, L. Wasserman, Springer, 2004.
- Mathematics for Machine Learning, M. P. Deisenroth, A. A. Faisal, and C. S. Ong, Cambridge University Press, 2020.
- Pattern Recognition and Machine Learning, C. Bishop, Springer, 2007.
- Machine Learning: A Probabilistic Perspective, K. Murphy, MIT Press, 2012.