
Machine Learning Course for Bachelor Students of Software Engineering


Contacts

Telegram chat

Lecturers: Anna Kuzina; Evgenii Egorov

Class Teachers and TAs

Class Teacher | Contact | Group | TA (contact)
Maria Tikhonova | tg: @mashkka_t | БПИ184 | Alexandra Kogan (tg: @horror_in_black)
Maksim Karpov | tg: @buntar29 | БПИ181, БПИ182 | Kirill Bykov (tg: @darkydash), Victor Grishanin (tg: @vgrishanin)
Polina Polunina | tg: @ppolunina | БПИ185 | Michail Kim (tg: @kimihailv)
Vadim Kokhtev | tg: @despairazure | БПИ183 | Daniil Kosakin (tg: @nieto95)

Use this form to send feedback to the course team anytime

Recommended Literature

[PR] Christopher M. Bishop. 2006. Pattern Recognition and Machine Learning (Information Science and Statistics). Springer-Verlag, Berlin, Heidelberg.
Link

[ESL] Hastie, T., Tibshirani, R., & Friedman, J. H. (2001). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. New York: Springer.
Link

[FML] Mohri, M., Rostamizadeh, A., & Talwalkar, A. (2018). Foundations of Machine Learning (2nd ed.). Cambridge, MA: The MIT Press.
Link

Class materials

Lectures

Lecture Recordings

Date | Topic | Lecture materials | Reading
30 jan | 1. Introduction | Slides | [FML] Ch 1; [ESL] Ch 2.1-2
6 feb | 2. Gradient Optimization | Slides | [FML] Appx A, B; Convex Optimization book
13 feb | 3. Linear Regression | Slides, Notebook | [PR] Ch 3.1; [ESL] Ch 3.1-4; [FML] Ch 4.4-6
20 feb | 4. Linear Classification | Slides (GLM), Notes (GLM), Slides (linclass) | [PR] Ch 4.1; [ESL] Ch 4.1-2, 4.4; [FML] Ch 13
27 feb | 5. Logistic Regression and SVM | Slides | [ESL] Ch 12.1-3; [FML] Ch 5, 6
6 mar | 6. Decision Trees | Slides | [ESL] Ch 9.2
12 mar | 7. Bagging, Random Forest | Slides, Notebook | [PR] Ch 3.2 (bias-variance); [ESL] Ch 8; [FML] Ch 7
19 mar | 8. Gradient Boosting | Slides | [PR] Ch 14.3; [ESL] Ch 10
22 mar - 4 apr | NO LECTURES | --- | ---
9 apr | 9. Clustering and Anomaly Detection | Slides, Notebook | [PR] Ch 9.1; [ESL] Ch 13.2, 14.3
16 apr | 10. EM and PCA | Lecture notes | [PR] Ch 9.2-9.4; [ESL] Ch 8.5
23 apr | 11. Bayesian Linear Regression | Slides | [PR] Ch 2.3-2.4, 3.3-3.5
30 apr | 12. GP for regression and classification tasks | | [PR] Ch 6.4
14 may | 13. MLP and DNN for Classification | Slides | [PR] Ch 5.1-5.5; [ESL] Ch 11
21 may | 14. Deep Generative Models Overview | |
28 may | 15. Summary | |

Practicals

Date | Topic | Materials | Extra Reading/Practice
25-30 jan | 1. Basic toolbox | Notebook; Dataset | Python Crash Course
1-6 feb | 2. EDA and Scikit-learn | Notebook |
8-13 feb | 3. Calculus recap and Gradient Descent | Notebook, pdf | The Matrix Cookbook
15-20 feb | 4. Linear Regression | Notebook |
22-27 feb | 5. Classification | Notebook |
1-6 mar | 6. Texts and Multiclass classification | Notebook, Dataset |
8-13 mar | 7. Decision Trees | Notebook |
15-20 mar | 8. Ensembles | Notebook |
5-10 apr | 9. Gradient Boosting | Notebook |
12-17 apr | 10. Anomaly detection and Clustering | Notebook |
19-24 apr | 11. EM | Tasks |
25-30 apr | 12. Empirical Bayes and RVM | Notebook | [PR] Ch 7.2
10-15 may | 13. GP | Notebook |
17-22 may | 14. MLP | Notebook |
24-29 may | 15. Summary | Slides |

Assignments

We'll be using AnyTask for grading: course link

Date Published | Task | Deadline
6 feb | HW 1: Notebook, dataset | 20 feb
26 feb | HW 2: Notebook | 13 mar
14 mar | HW 3: Notebook | 4 apr
10 apr | HW 4: Notebook, dataset | 1 may
3 may | HW 5: Notebook, datasets | 24 may
31 may | HW 6: Task (pdf), Task (tex) | 10 june

Grading

Final grade = 0.7*HW + 0.3*Exam

  • HW - average grade for assignments 1 to 5. You can earn extra points by solving HW 6, but the HW grade is capped at 10.
  • Exam - grade for the exam

You can skip the exam if your average grade for the first five assignments is at least 6 (HW >= 6). In this case:

Final grade = HW
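
For concreteness, here is a minimal Python sketch of the grading rules above. It is only an illustration, not an official grading script; the function name, the 0-10 grade scale, and the assumption that HW 6 bonus points are added to the HW average before the cap are ours.

```python
def final_grade(hw_grades, hw6_bonus=0.0, exam=None):
    """Illustrative sketch of the grading rules above; not an official script.

    hw_grades : grades for assignments 1-5 (0-10 scale assumed)
    hw6_bonus : extra points from HW 6 (assumed to be added to the HW average)
    exam      : exam grade, or None if the exam is skipped
    """
    # HW = average of assignments 1-5 plus the HW 6 bonus, capped at 10 in total
    hw = min(sum(hw_grades) / 5 + hw6_bonus, 10)

    if exam is None:
        # The exam can be skipped only when HW >= 6; then Final grade = HW
        assert hw >= 6, "the exam can only be skipped when HW >= 6"
        return hw

    # Otherwise the weighted formula applies: Final grade = 0.7*HW + 0.3*Exam
    return 0.7 * hw + 0.3 * exam


# Example with hypothetical numbers: HW average 8.0 plus 1 bonus point, exam skipped
print(final_grade([8, 7, 6, 9, 10], hw6_bonus=1.0))  # -> 9.0
```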
