Syllabus and Course Schedule

Time and Location: Monday, Wednesday 4:30-5:50pm, Bishop Auditorium
Class Videos: Current quarter's class videos are available here for SCPD students and here for non-SCPD students.
* We may update the course materials. Please check for the latest version before each lecture.


Event | Date | Description | Materials and Assignments
Lecture 1 4/1 Introduction and Basic Concepts Class Notes: Introduction [pdf]
A0 4/3 Problem Set 0 [pdf] [solution]. Out 4/1. Due 4/10. Submission instructions.
Lecture 2 4/3 Supervised Learning Setup. Linear Regression. Class Notes
  • Supervised Learning, Discriminative Algorithms [pdf]
  • Dataset Loading and Visualization [pdf] [ipynb]
  • Gradient Descent Visualization [pdf] [ipynb]
Section 4/5 Discussion Section: Linear Algebra [Notes]
Lecture 3 4/8 Weighted Least Squares. Logistic Regression. Newton's Method. Perceptron. Exponential Family. Generalized Linear Models. Class Notes
  • Perceptron [ipynb]
  • Maximum Entropy and Exponential Families [pdf]
Lecture 4 4/10
A1 4/10 Problem Set 1 [zip]. Out 4/10. Due 4/24. Submission instructions.
Section 4/12 Discussion Section: Probability [Notes][Slides]
Lecture 5 4/15 Gaussian Discriminant Analysis Class Notes
  • Generative Algorithms [pdf]
  • Kernel Methods and SVM [pdf]
Lecture 6 4/17 Naive Bayes. Laplace Smoothing. Kernel Methods.
Section 4/19 Discussion Section: Python [slides]
Lecture 7 4/22 SVM. Kernels. Class Notes
Lecture 8 4/24 Neural Networks. Class Notes
  • Deep learning [pdf]
  • Backpropagation [pdf]
A2 4/24 Problem Set 2 [zip]. Out 4/24. Due 5/8. Submission instructions.
Section 4/26 Discussion Section: Learning Theory [pdf]
Project 4/26 Project proposal due at 11:59pm.
Lecture 9 4/29 Neural Networks. Class Notes
Lecture 10 5/1 Bias/Variance. Regularization. Feature/Model Selection. Class Notes
  • Bias/variance tradeoff and error analysis [pdf]
  • Additional notes on bias/variance [pdf]
  • Regularization and Model Selection [ps] [pdf]
Section 5/3 Discussion Section: Evaluation Metrics [Slides]
Lecture 11 5/6 Practical Advice for ML Projects. Class Notes
  • Advice on applying machine learning [pdf]
Lecture 12 5/8 K-means. Mixture of Gaussians. Expectation Maximization. Class Notes
  • Unsupervised Learning, k-means clustering [pdf]
  • Mixture of Gaussians [pdf]
  • EM and VAE [pdf]
  • Reading: K-means++ [pdf]
A3 5/8 Problem Set 3 [zip]. Out 5/8. Due 5/22. Submission instructions.
Section 5/10 Discussion Section: Midterm Review [pdf]
Lecture 13 5/13 GMM (EM). Variational Autoencoders. Class Notes
  • EM and VAE [pdf]
Lecture 14 5/15 Principal Component Analysis. Independent Component Analysis. Class Notes
  • Principal Components Analysis [pdf]
  • Independent Components Analysis [pdf]
Midterm 5/15 We will have an in-class midterm from 7pm to 10pm. Logistics. SCPD Logistics. Practice Midterm.
Lecture 15 5/20 MDPs. Bellman Equations. Value Iteration and Policy Iteration. Class Notes
  • Reinforcement Learning [pdf]
Lecture 16 5/22 Value function approximation. Class Notes
A4 5/22 Problem Set 4 [zip]. Out 5/22. Due 6/5. Submission instructions.
Section 5/24 Discussion Section: Convolutional Neural Nets [pdf]
Project 5/24 Project milestones due at 11:59pm.
Lecture 18 5/29 Policy search. REINFORCE. Class Notes
  • Policy Gradient [pdf]
Section 5/31 Discussion Section: Gaussian Processes [pdf]
Lecture 19 6/3 Other Settings of RL. Imitation Learning. Adversarial Machine Learning. Class Notes
  • Adversarial Machine Learning [pdf] [ppt]
  • Other settings of RL, Imitation Learning [pdf] [ppt]
Lecture 20 6/5 Course Review and Wrap-up. Class Notes
Project 6/11 Project poster PDF and project recording (remote SCPD only) due at 11:59pm. Submission instructions.
Project 6/12 Poster presentations from 3:30-6:30pm. Venue and details to be announced.
Project 6/12 Final writeup due at 6:30pm (no late days).
Section Notes
  1. Linear Algebra Review and Reference [pdf]
  2. Probability Theory Review [pdf]
  3. Convex Optimization Overview, Part I [ps] [pdf]
  4. Convex Optimization Overview, Part II [ps] [pdf]
  5. Hidden Markov Models [ps] [pdf]
  6. The Multivariate Gaussian Distribution [pdf]
  7. More on Gaussian Distribution [pdf]
  8. Gaussian Processes [pdf]
Other Resources
  1. Advice on applying machine learning: Slides from Andrew's lecture on getting machine learning algorithms to work in practice can be found here.
  2. Previous projects: A list of last year's final projects can be found here.
  3. Data: Here is the UCI Machine Learning Repository, which contains a large collection of standard datasets for testing learning algorithms. If you want to see examples of recent work in machine learning, start by taking a look at the conferences NeurIPS (all old NeurIPS papers are online) and ICML. Some other related conferences include UAI, AAAI, and IJCAI.
  4. Viewing PostScript and PDF files: Depending on the computer you are using, you may be able to download a PostScript viewer or PDF viewer for it if you don't already have one.
  5. Machine learning study guides tailored to CS 229 by Afshine Amidi and Shervine Amidi.
Supplementary Notes
  1. Binary classification with +/-1 labels [pdf]
  2. Boosting algorithms and weak learning [pdf]
  3. Functional after implementing stump_booster.m in PS2. [here]
  4. The representer theorem [pdf]
  5. Hoeffding's inequality [pdf]