Syllabus and Course Schedule

[Previous offerings: Fall 2019, Spring 2020]

This table will be updated regularly throughout the quarter to reflect what was covered, along with the corresponding readings and notes.
Date | Event | Description | Materials and Assignments
6/22 Lecture 0
  • Introduction and Logistics
Class Notes
  • Introduction [pptx]
6/22 Assignment Problem Set 0. [files] Due 6/29 at 11:59pm.
Week 1 Lecture 1
  • Review of Linear Algebra
Class Notes
  • Linear Algebra (sections 1-3) [pdf]
  • Additional Linear Algebra Note [pdf]
Lecture 2
  • Review of Matrix Calculus
  • Review of Probability
Class Notes
  • Linear Algebra (section 4) [pdf]
  • Probability Theory [pdf]
  • Probability Theory Slides [pdf]
Lecture 3
  • Review of Probability and Statistics
Class Notes
  • Probability Theory [pdf]
6/29 Assignment Problem Set 1. [files][code] Due 7/13 at 11:59pm.
Week 2 Lecture 4
  • Linear Regression
  • Gradient Descent (GD), Stochastic Gradient Descent (SGD)
  • Normal Equations
  • Probabilistic Interpretation
  • Maximum Likelihood Estimation (MLE)
Class Notes
  • Supervised Learning (sections 1-3) [pdf]
Lecture 5
  • Perceptron
  • Logistic Regression
  • Newton's Method
Class Notes
  • Supervised Learning (sections 5-7) [pdf]
Lecture 6
  • Exponential Family
  • Generalized Linear Models (GLM)
Class Notes
  • Supervised Learning (sections 8-9) [pdf]
Week 3 Lecture 7
  • Gaussian Discriminant Analysis (GDA)
  • Naive Bayes
  • Laplace Smoothing
Class Notes
  • Generative Algorithms [pdf]
Lecture 8
  • Kernel Methods
  • Support Vector Machines (SVM)
Class Notes
  • Kernel Methods and SVM [pdf]
Lecture 9
  • Gaussian Processes
Class Notes
  • Gaussian Processes [pdf]
Week 4 Lecture 10
  • Neural Networks and Deep Learning
Class Notes
  • Deep Learning (skip section 3.3) [pdf]
  • Backpropagation [pdf]
Lecture 11
  • Deep Learning (cont'd)
Lecture 12
  • Bias and Variance
  • Regularization, Bayesian Interpretation
  • Model Selection
Class Notes
  • Regularization and Model Selection [pdf]
Other Resources
  1. All lecture videos can be accessed through Canvas.
  2. Advice on applying machine learning: Slides from Andrew's lecture on getting machine learning algorithms to work in practice can be found here.
  3. Previous projects: A list of last year's final projects can be found here.
  4. Data: Here is the UCI Machine Learning Repository, which contains a large collection of standard datasets for testing learning algorithms. If you want to see examples of recent work in machine learning, start by taking a look at the conferences NeurIPS (all old NeurIPS papers are online) and ICML. Some other related conferences include UAI, AAAI, and IJCAI.
  5. Viewing PostScript and PDF files: Depending on your computer, you may need to download a PostScript or PDF viewer if you don't already have one.
  6. Machine learning study guides tailored to CS 229 by Afshine Amidi and Shervine Amidi.