Syllabus and Course Schedule

Note: This schedule is being updated for Summer 2020. Dates are subject to change as deadlines are finalized; please check back soon.


Date | Event | Description | Materials and Assignments
6/22 Lecture 0
  • Introduction and Logistics
Class Notes
  • Introduction [pptx]
6/22 Assignment Problem Set 0. Due 6/29 at 11:59pm.
Week 1 Lecture 1
  • Review of Linear Algebra
Class Notes
  • Linear Algebra (section 1-3) [pdf]
  • Additional Linear Algebra Note [pdf]
Lecture 2
  • Review of Matrix Calculus
  • Review of Probability
Class Notes
  • Linear Algebra (section 4) [pdf]
  • Probability Theory [pdf]
  • Probability Theory Slides [pdf]
Lecture 3
  • Review of Probability and Statistics
Class Notes
  • Probability Theory [pdf]
6/29 Assignment Problem Set 1. Due 7/13 at 11:59pm.
Week 2 Lecture 4
  • Linear Regression
  • Gradient Descent (GD), Stochastic Gradient Descent (SGD)
  • Normal Equations
  • Probabilistic Interpretation
  • Maximum Likelihood Estimation (MLE)
Class Notes
  • Supervised Learning (section 1-3) [pdf]
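As a quick, concrete companion to the Lecture 4 topics above, here is a minimal NumPy sketch of batch gradient descent on the least-squares cost, checked against the normal equations. The toy data, learning rate, and iteration count are arbitrary illustrative choices, not part of the course materials.

```python
import numpy as np

# Toy data: 100 examples, an intercept column of ones plus one feature.
rng = np.random.default_rng(0)
X = np.hstack([np.ones((100, 1)), rng.normal(size=(100, 1))])
y = 3.0 + 2.0 * X[:, 1] + 0.1 * rng.normal(size=100)

# Batch gradient descent on J(theta) = (1/2) ||X theta - y||^2 (mean gradient per step).
theta = np.zeros(2)
alpha = 0.01                      # learning rate (arbitrary for this toy problem)
for _ in range(1000):
    grad = X.T @ (X @ theta - y)  # gradient of J
    theta -= alpha * grad / len(y)

# Closed-form solution via the normal equations: theta = (X^T X)^{-1} X^T y.
theta_closed = np.linalg.solve(X.T @ X, X.T @ y)

print(theta, theta_closed)        # the two estimates should roughly agree
```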
Lecture 5
  • Perceptron
  • Logistic Regression
  • Newton's Method
Class Notes
  • Supervised Learning (section 5-7) [pdf]
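To make the Lecture 5 material concrete, the following is a minimal sketch of Newton's method applied to logistic regression on made-up data (the dataset, iteration count, and variable names are illustrative assumptions, not taken from the notes).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy binary labels with some label noise so the classes are not perfectly separable.
rng = np.random.default_rng(1)
X = np.hstack([np.ones((200, 1)), rng.normal(size=(200, 2))])
y = (X[:, 1] + X[:, 2] + rng.normal(size=200) > 0).astype(float)

# Newton's method: theta <- theta - H^{-1} grad, using the gradient and Hessian
# of the negative log-likelihood of logistic regression.
theta = np.zeros(X.shape[1])
for _ in range(10):
    h = sigmoid(X @ theta)                    # predicted probabilities
    grad = X.T @ (h - y)                      # gradient
    H = X.T @ (X * (h * (1 - h))[:, None])    # Hessian
    theta -= np.linalg.solve(H, grad)

print(theta)
```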
Lecture 6
  • Exponential Family
  • Generalized Linear Models (GLM)
Class Notes
  • Supervised Learning (section 8-9) [pdf]
Week 3 Lecture 7
  • Gaussian Discriminant Analysis (GDA)
  • Naive Bayes
  • Laplace Smoothing
Class Notes
  • Generative Algorithms [pdf]
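As a small illustration of the Naive Bayes and Laplace smoothing topics listed for Lecture 7, here is a sketch of a Bernoulli Naive Bayes classifier on made-up binary features (purely illustrative; the data and names are assumptions, not from the notes).

```python
import numpy as np

# Toy binary features and labels.
rng = np.random.default_rng(2)
X = rng.integers(0, 2, size=(50, 4))          # 50 examples, 4 binary features
y = rng.integers(0, 2, size=50)               # binary labels

phi_y = y.mean()                              # estimate of P(y = 1)
# Laplace-smoothed estimates of P(x_j = 1 | y): add 1 to counts, 2 to denominators.
phi_j1 = (X[y == 1].sum(axis=0) + 1) / ((y == 1).sum() + 2)
phi_j0 = (X[y == 0].sum(axis=0) + 1) / ((y == 0).sum() + 2)

def predict(x):
    # Compare log P(x | y = 1) P(y = 1) against log P(x | y = 0) P(y = 0).
    lp1 = np.log(phi_y)     + np.sum(x * np.log(phi_j1) + (1 - x) * np.log(1 - phi_j1))
    lp0 = np.log(1 - phi_y) + np.sum(x * np.log(phi_j0) + (1 - x) * np.log(1 - phi_j0))
    return int(lp1 > lp0)

print(predict(np.array([1, 0, 1, 1])))
```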
Lecture 8
  • Kernel Methods
  • Support Vector Machine
Class Notes
  • Kernel Methods and SVM [pdf]
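For the kernel methods portion of Lecture 8, the sketch below computes a Gaussian (RBF) kernel matrix on toy data; the bandwidth and data are arbitrary illustrative choices.

```python
import numpy as np

# Gaussian (RBF) kernel matrix: K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2)).
rng = np.random.default_rng(3)
X = rng.normal(size=(5, 3))
sigma = 1.0

# Pairwise squared distances: ||x_i||^2 + ||x_j||^2 - 2 x_i . x_j
sq_dists = np.sum(X**2, axis=1, keepdims=True) + np.sum(X**2, axis=1) - 2 * X @ X.T
K = np.exp(-sq_dists / (2 * sigma**2))

print(K.shape)      # (5, 5); K is symmetric and positive semidefinite
```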
Lecture 9
  • Gaussian Processes
Class Notes
  • Gaussian Processes [pdf]
Optional
  • The Multivariate Gaussian Distribution [pdf]
  • More on Gaussian Distribution [pdf]
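To connect with the Gaussian process lecture above, here is a minimal sketch of the GP regression posterior mean with an RBF kernel on toy 1-D data (the kernel bandwidth, noise level, and data are made up for illustration).

```python
import numpy as np

def rbf(a, b, sigma=1.0):
    # RBF kernel between two sets of 1-D inputs.
    return np.exp(-(a[:, None] - b[None, :])**2 / (2 * sigma**2))

rng = np.random.default_rng(6)
X_train = np.linspace(0, 5, 10)
y_train = np.sin(X_train) + 0.1 * rng.normal(size=10)
X_test = np.linspace(0, 5, 50)
noise = 0.1**2

K = rbf(X_train, X_train) + noise * np.eye(10)   # kernel matrix plus observation noise
K_star = rbf(X_test, X_train)
mean = K_star @ np.linalg.solve(K, y_train)      # posterior mean at the test points

print(mean.shape)                                 # (50,)
```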
Week 4 Lecture 10
  • Neural Networks and Deep Learning
Class Notes
  • Deep Learning (skip Sec 3.3) [pdf]
Optional
  • Backpropagation [pdf]
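As a companion to the deep learning notes and the optional backpropagation note above, here is a tiny sketch of one forward and backward pass through a one-hidden-layer network with a squared-error loss; the shapes, activation, and initialization are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(7)
x, y = rng.normal(size=3), 1.0          # one example with 3 features, scalar target

W1, b1 = rng.normal(size=(4, 3)) * 0.1, np.zeros(4)
W2, b2 = rng.normal(size=4) * 0.1, 0.0

# Forward pass.
z1 = W1 @ x + b1
a1 = np.maximum(z1, 0)                  # ReLU activation
y_hat = W2 @ a1 + b2
loss = 0.5 * (y_hat - y)**2

# Backward pass (chain rule, layer by layer).
dy = y_hat - y
dW2, db2 = dy * a1, dy
da1 = dy * W2
dz1 = da1 * (z1 > 0)                    # ReLU derivative
dW1, db1 = np.outer(dz1, x), dz1

print(loss, dW1.shape, dW2.shape)       # scalar loss, (4, 3), (4,)
```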
Lecture 11
  • Deep Learning (cont'd)
Lecture 12
  • Bias and Variance
  • Regularization, Bayesian Interpretation
  • Model Selection
Class Notes
  • Regularization and Model Selection [pdf]
Lecture 13
  • Bias-Variance tradeoff (wrap-up)
  • Uniform Convergence
Class Notes
  • Bias Variance Analysis [pdf]
  • Statistical Learning Theory [pdf]
7/13 Assignment Problem Set 2. Due 7/27 at 11:59pm.
Week 5 Lecture 14
  • Reinforcement Learning (RL)
  • Markov Decision Processes (MDP)
  • Value and Policy Iterations
Class Notes
  • Reinforcement Learning and Control (Sec 1-2) [pdf]
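To give a concrete sense of the value iteration algorithm listed for Lecture 14, here is a sketch on a tiny made-up MDP (the transition probabilities, rewards, and discount factor are arbitrary; they are not from the course notes).

```python
import numpy as np

# P[a, s, s'] is the transition probability of action a from state s to s',
# R[s] is the reward of state s, gamma is the discount factor.
P = np.array([[[0.8, 0.2],      # action 0
               [0.1, 0.9]],
              [[0.5, 0.5],      # action 1
               [0.6, 0.4]]])
R = np.array([1.0, -0.5])
gamma = 0.9

V = np.zeros(2)
for _ in range(500):
    # Bellman backup: V(s) <- R(s) + gamma * max_a sum_{s'} P(s' | s, a) V(s')
    V = R + gamma * np.max(P @ V, axis=0)

policy = np.argmax(P @ V, axis=0)   # greedy policy with respect to V
print(V, policy)
```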
Lecture 15
  • RL (wrap-up)
  • Learning MDP model
  • Continuous States
Class Notes
  • Reinforcement Learning and Control (Sec 3-4) [pdf]
Week 6 Lecture 16
  • K-means clustering
  • Mixture of Gaussians (GMM)
  • Expectation Maximization (EM)
Class Notes
  • K-means [pdf]
  • Mixture of Gaussians [pdf]
  • Expectation Maximization (Sec 1-2, skip 2.1) [pdf]
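A minimal sketch of the K-means algorithm from Lecture 16 on toy 2-D data follows; the number of clusters, the data, and the initialization scheme are illustrative assumptions (and the sketch ignores the empty-cluster edge case).

```python
import numpy as np

# Two well-separated toy blobs in 2-D.
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
k = 2

centroids = X[rng.choice(len(X), k, replace=False)]   # initialize from random points
for _ in range(20):
    # Assignment step: each point goes to the nearest centroid.
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # Update step: each centroid moves to the mean of its assigned points.
    centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])

print(centroids)
```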
Lecture 17
  • EM (wrap-up)
  • Factor Analysis
Class Notes
  • Expectation Maximization (Sec 3) [pdf]
  • Factor Analysis [pdf]
Lecture 18
  • Factor Analysis (wrap-up)
  • Principal Components Analysis (PCA)
  • Independent Components Analysis (ICA)
Class Notes
  • Principal Components Analysis [pdf]
  • Independent Components Analysis [pdf]
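Here is a short sketch of PCA via the SVD of the centered data matrix, as a companion to the Lecture 18 topics; the toy data and the choice to keep two components are arbitrary illustrative assumptions.

```python
import numpy as np

# Toy data with one dominant, one moderate, and one small-variance direction.
rng = np.random.default_rng(5)
X = rng.normal(size=(100, 3)) @ np.array([[2.0, 0.0, 0.0],
                                          [0.0, 1.0, 0.0],
                                          [0.0, 0.0, 0.1]])

Xc = X - X.mean(axis=0)               # center each feature
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
components = Vt[:2]                   # top-2 principal directions
Z = Xc @ components.T                 # project onto the top-2 components

print(components)
print(Z.shape)                        # (100, 2)
```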
Week 7 Lecture 19
  • Maximum Entropy and Exponential Family
  • KL-Divergence
  • Calibration and Proper Scoring Rules
Class Notes
  • Maximum Entropy [pdf]
Lecture 20
  • Variational Inference
  • EM Variants
  • Variational Autoencoder
Class Notes
  • VAE (Sec 4) [pdf]
Lecture 21
  • Evaluation Metrics
Class Notes
  • Evaluation Metrics [pptx]
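As a quick illustration of the Lecture 21 topic, the sketch below computes precision, recall, and F1 from toy predictions (the labels are made up; this is not taken from the slides).

```python
import numpy as np

y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0])

tp = np.sum((y_pred == 1) & (y_true == 1))   # true positives
fp = np.sum((y_pred == 1) & (y_true == 0))   # false positives
fn = np.sum((y_pred == 0) & (y_true == 1))   # false negatives

precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)

print(precision, recall, f1)   # 0.75, 0.75, 0.75 for this toy example
```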
7/13 Assignment Problem Set 3. Due 8/10 at 11:59pm.
Week 8 Lecture 22
  • Practical advice and tips
  • Review for Finals
Class Notes
Lecture 23
  • Review for Finals
Class Notes
Other Resources
  1. All lecture videos can be accessed through Canvas.
  2. Advice on applying machine learning: Slides from Andrew's lecture on getting machine learning algorithms to work in practice can be found here.
  3. Previous projects: A list of last year's final projects can be found here.
  4. Data: Here is the UCI Machine Learning Repository, which contains a large collection of standard datasets for testing learning algorithms. If you want to see examples of recent work in machine learning, start by taking a look at the conferences NeurIPS (all old NeurIPS papers are online) and ICML. Some other related conferences include UAI, AAAI, and IJCAI.
  5. Viewing PostScript and PDF files: If you don't already have one, you may need to download a PostScript or PDF viewer for your computer.
  6. Machine learning study guides tailored to CS 229 by Afshine Amidi and Shervine Amidi.