STA 414/2104 Winter 2024:
Statistical Methods for Machine Learning II
This course introduces probabilistic learning tools such as
exponential families, directed graphical models,
Markov random fields,
exact inference techniques,
message passing,
sampling and MCMC,
hidden Markov models,
variational inference,
EM algorithm,
Bayesian regression,
probabilistic PCA,
neural networks,
kernel methods,
and Gaussian processes.
It also offers a broad view of model-building and optimization techniques built from probabilistic building blocks, which will serve as a foundation for more advanced machine learning courses.
More details can be found in the syllabus and on Piazza.
Announcements:
- Hw 3 solutions are published.
- The practice final with answers is now available.
- The practice final is now available.
- Both lectures on Mar 25/26 are moved online; I am sorry! I will share more information soon. Hopefully Quercus will be back up by then.
- TA office hours for HW3 are on 3/18 and 3/19, both 11am-noon in UY 9040.
- Hw 3 is out, and due on 3/24 23:59.
- Hw 2 solutions are published.
- Final exam is in-person on 18 Apr, 7pm-10pm (sic!) at BR 200.
- Midterm practice solutions are released.
- A representative practice midterm is released (solutions will follow in a couple of days).
- Hw 1 solutions are published.
- Hw 2 deadline extended to Feb 25, 23:59.
- Hw 2 is out, and due on 2/18 23:59. TA office hours are on 2/13 1-2pm and 2/16 11am-12pm, both at Sidney Smith, room 621.
- Hw 1 is out, and due on 2/04 23:59. TA office hours are on 1/31 3-4pm, 2/02 11am-12pm, both at Sidney Smith, rooms 621/621A.
- Lectures begin on Jan 8/9!
Instructors:
Prof | Piotr Zwiernik
Email | piotr.zwiernik@utoronto.ca
Office hours | Tuesday 15:30-17:30 (UY 9040)
Teaching Assistants:
Ichiro Hashimoto, Kevin Zhang, Junhao Zhu
They will handle all questions related to homework assignments.
- Email: sta414.2104@course.utoronto.ca (in the subject of the email, indicate the scope: HW1, HW2, general, etc.)
Time & Location:
Section | Room | Lecture time
STA 414 LEC0101 & STA 2104 LEC0101 | PB B250 | M 14-17
STA 414 LEC5101 & STA 2104 LEC5101 | MS 2170 | T 18-21
Suggested Reading
No required textbooks. Suggested reading will be posted after each lecture (See lectures below).
Lectures and timeline
Week | Lectures | Suggested reading | Tutorials | Video | Timeline
1 | Introduction; Probabilistic Models | PML1 1.1-1.3; PML1 3.4, 4.2 | tut w1 | NA | syllabus
2 | Decision theory; Directed Graphical Models | PRML 1.5; PML2 4.2 | tut w2, moralization | rec w2 | -
3 | Markov Random Fields; Exact inference | PML2 2.3, 4.3; PML2 9.5 | tut w3 | rec w3, tut w3 | hw1 out
4 | Message passing; Monte Carlo Methods | PML2 9.3, 9.4; PML2 11.1, 11.2, 11.5 | tut w4 | rec w4 | hw1 due
5 | Markov Chain Monte Carlo | PML2 2.6, 12.1-12.6 | tut w5, demo notebook | rec w5 | hw2 out
6 | Hidden Markov Models; Variational inference I | PML2 9.2; PML2 5.1 | HMM colab, VI colab | rec w6 | hw2 due
7 | Reading week (no class/tutorial) | - | - | - | -
8 | Midterm exam | - | - | - | midterm
9 | Variational inference II; EM algorithm | PML2 10.1-10.3; PML2 28.2.1, 6.5.3 | tut w7, VI for stats | rec w7 | -
10 | Probabilistic PCA; Bayesian regression | PRML 12.2; PRML 3.3 | tut w8 | rec w8 | hw3 out
11 | Kernel methods; Gaussian processes | PRML 6.1-3; PRML 6.4 | GP tutorial, tut w9, tut pdf | rec w9 | -
12 | Neural Networks | PRML 5; notes | tut w10, tut pdf | rec w10 | hw3 due
13 | Variational Autoencoders | PML2 16.3.3, 21 | VAE colab | rec w11 | -
Homeworks
Homework # | Out | Due | TA Office Hours | Solutions
Assignment 1 | 1/22 | 2/04 | 1/31 3-4pm, 2/02 11am-12pm, both at Sidney Smith, rooms 621/621A | solutions
Assignment 2 | 2/05 | 2/18 | 2/13 1-2pm and 2/16 11am-12pm, both at Sidney Smith, room 621 | solutions
Assignment 3 | 3/04 | 3/24 | 3/18 and 3/19, both 11am-noon in UY 9040 | solutions
Computing Resources
For the homework assignments, we will use Python and libraries such as NumPy, SciPy, and scikit-learn. You have two options:
- The easiest option is to run everything on Colab.
- Alternatively, you can install everything yourself on your own machine.
- If you don’t already have Python, install it using Anaconda.
- Use pip to install the required packages:
pip install scipy numpy autograd matplotlib jupyter scikit-learn
(Note: the package name on PyPI is scikit-learn, not sklearn; the latter is only the import name.)
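As a quick sanity check after installing, a short snippet along these lines can verify that each package imports; the package list here simply mirrors the pip command above (remember that scikit-learn is imported as sklearn):

```python
import importlib

# Import names corresponding to the packages installed above.
for pkg in ["scipy", "numpy", "autograd", "matplotlib", "sklearn"]:
    try:
        mod = importlib.import_module(pkg)
        print(pkg, getattr(mod, "__version__", "installed"))
    except ImportError:
        print(pkg, "MISSING - rerun the pip command above")
```

If any line reports MISSING, rerun the pip command for that package before starting the homework.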
- For those unfamiliar with NumPy, there are many good resources, e.g. the NumPy tutorial and NumPy Quickstart.
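To give a flavour of the NumPy operations the assignments rely on, here is a minimal sketch (the particular arrays are illustrative, not from any homework) covering array creation, broadcasting, and matrix-vector products:

```python
import numpy as np

# A 3x2 data matrix and a length-2 weight vector.
X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
w = np.array([0.5, -1.0])

# Broadcasting: w is applied elementwise across each row of X.
scaled = X * w            # shape (3, 2)

# Matrix-vector product, a staple of the models in this course.
y = X @ w                 # shape (3,); equals [-1.5, -2.5, -3.5]

# Column means, e.g. for centering data before PCA.
mu = X.mean(axis=0)       # array([3., 4.])
print(y, mu)
```

Vectorized operations like these replace explicit Python loops and are both faster and closer to the linear-algebra notation used in lecture.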