Syllabus and Course Schedule

[Previous offerings: Autumn 2018, Spring 2019]


* Below is a collection of topics, of which we plan to cover a large subset this quarter. The specific topics and their order are subject to change.
Category | Topic
Review
  • Linear Algebra
  • Matrix Calculus
  • Probability and Statistics
Supervised Learning
  • Linear Regression (Gradient Descent, Normal Equations)
  • Locally Weighted Linear Regression (LWR)
  • Logistic Regression, Perceptron
  • Newton's Method, KL-divergence, (cross-)Entropy, Natural Gradient
  • Exponential Family and Generalized Linear Models
  • Generative Models (Gaussian Discriminant Analysis, Naive Bayes)
  • Kernel Method (SVM, Gaussian Processes)
  • Tree Ensembles (Decision trees, Random Forests, Boosting and Gradient Boosting)
Learning Theory
  • Regularization
  • Bias-Variance Decomposition and Tradeoff
  • Concentration Inequalities
  • Generalization and Uniform Convergence
  • VC-dimension
Deep Learning
  • Neural Networks
  • Backpropagation
  • Deep Architectures
Unsupervised Learning
  • K-means
  • Gaussian Mixture Model (GMM)
  • Expectation Maximization (EM)
  • Variational Auto-encoder (VAE)
  • Factor Analysis
  • Principal Components Analysis (PCA)
  • Independent Components Analysis (ICA)
Reinforcement Learning (RL)
  • Markov Decision Processes (MDP)
  • Bellman's Equations
  • Value Iteration and Policy Iteration
  • Value Function Approximation
  • Q-Learning
Application
  • Advice on structuring an ML project
  • Evaluation Metrics
This table will be updated regularly through the quarter to reflect what was actually covered, along with corresponding readings and notes.
Event | Date | Description | Materials and Assignments
Lecture 1 6/24
  • Introduction and Logistics
  • Review of Linear Algebra
Class Notes
  • Introduction [pptx]
  • Linear Algebra (section 1-3) [pdf]
Lecture 2 6/26
  • Review of Matrix Calculus
  • Review of Probability
Class Notes
  • Linear Algebra (section 4) [pdf]
  • Probability Theory [pdf]
  • Probability Theory Slides [pdf]
Lecture 3 6/28
  • Review of Probability and Statistics
  • Setting of Supervised Learning
Class Notes
  • Supervised Learning [pdf]
  • Probability Theory [pdf]
Lecture 4 7/1
  • Linear Regression
  • Gradient Descent (GD), Stochastic Gradient Descent (SGD)
  • Normal Equations
  • Probabilistic Interpretation
  • Maximum Likelihood Estimation (MLE)
Class Notes
  • Supervised Learning (section 1-3) [pdf]
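As a quick illustration of the gradient descent and normal equations material in this lecture, here is a minimal NumPy sketch (the toy data, learning rate, and iteration count are made up for illustration; this is not course-provided code):

```python
import numpy as np

# Toy data: y = 2x + 1 plus a little noise (illustrative only).
rng = np.random.default_rng(0)
X = np.c_[np.ones(50), rng.uniform(-1, 1, 50)]   # design matrix with intercept column
y = X @ np.array([1.0, 2.0]) + 0.01 * rng.standard_normal(50)

# Batch gradient descent on the least-squares cost J(theta) = ||X theta - y||^2 / (2m).
theta = np.zeros(2)
alpha = 0.5                                      # learning rate (chosen for this toy problem)
for _ in range(500):
    grad = X.T @ (X @ theta - y) / len(y)        # gradient of J at the current theta
    theta -= alpha * grad

# The normal equations give the same answer in closed form.
theta_closed = np.linalg.solve(X.T @ X, X.T @ y)
print(theta, theta_closed)                       # both near [1, 2]
```

Both routes minimize the same cost; gradient descent scales to large feature counts, while the normal equations require solving a linear system in the number of features.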
Lecture 5 7/3
  • Perceptron
  • Logistic Regression
  • Newton's Method
Class Notes
  • Supervised Learning (section 5-7) [pdf]
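A compact sketch of Newton's method applied to logistic regression, as covered in this lecture (the six data points are invented so that the maximum-likelihood solution exists; not course code):

```python
import numpy as np

# Toy 1-D data with overlapping classes, plus an intercept column.
x = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
X = np.c_[np.ones_like(x), x]
y = np.array([0.0, 0.0, 1.0, 0.0, 1.0, 1.0])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

theta = np.zeros(2)
for _ in range(10):
    h = sigmoid(X @ theta)
    grad = X.T @ (h - y)           # gradient of the negative log-likelihood
    W = np.diag(h * (1 - h))       # diagonal weights from the Hessian
    H = X.T @ W @ X                # Hessian of the negative log-likelihood
    theta -= np.linalg.solve(H, grad)

print(theta)                        # slope is positive; intercept is ~0 by symmetry
```

Each Newton update solves a weighted least-squares system, which is why it typically converges in a handful of iterations where plain gradient ascent needs many.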
Lecture 6 7/5
  • Exponential Family
  • Generalized Linear Models (GLM)
Class Notes
  • Supervised Learning (section 8-9) [pdf]
Lecture 7 7/8
  • Gaussian Discriminant Analysis (GDA)
  • Naive Bayes
  • Laplace Smoothing
Class Notes
  • Generative Algorithms [pdf]
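The Laplace smoothing idea from this lecture can be shown in a few lines: add-one counts keep any estimated probability away from exactly 0 or 1 (the word-occurrence matrix below is a made-up toy, not course data):

```python
import numpy as np

# Toy binary feature matrix (rows = documents, columns = word present / absent).
X = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 0, 1],
              [0, 1, 0]])
y = np.array([1, 1, 0, 0])

def smoothed_params(X, y, c):
    """P(x_j = 1 | y = c) with add-one (Laplace) smoothing: (count + 1) / (n_c + 2)."""
    Xc = X[y == c]
    return (Xc.sum(axis=0) + 1) / (len(Xc) + 2)

phi1 = smoothed_params(X, y, 1)
phi0 = smoothed_params(X, y, 0)
print(phi1, phi0)   # no estimate is exactly 0 or 1
```

Without the +1/+2 correction, a word never seen in one class would get probability 0, zeroing out the whole class posterior for any document containing it.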
Lecture 8 7/10
  • Kernel Methods
  • Support Vector Machine
Class Notes
  • Kernel Methods and SVM [pdf]
Lecture 9 7/12
  • Gaussian Processes
Class Notes
  • Gaussian Processes [pdf]
Optional
  • The Multivariate Gaussian Distribution [pdf]
  • More on Gaussian Distribution [pdf]
Lecture 10 7/15
  • Neural Networks and Deep Learning
Class Notes
  • Deep Learning (skip Sec 3.3) [pdf]
Optional
  • Backpropagation [pdf]
Lecture 11 7/17
  • Deep Learning (contd)
Lecture 12 7/19
  • Bias and Variance
  • Regularization, Bayesian Interpretation
  • Model Selection
Class Notes
  • Regularization and Model Selection [pdf]
Lecture 13 7/22
  • Bias-Variance tradeoff (wrap-up)
  • Uniform Convergence
Class Notes
  • Bias Variance Analysis [pdf]
  • Statistical Learning Theory [pdf]
Lecture 14 7/24
  • Reinforcement Learning (RL)
  • Markov Decision Processes (MDP)
  • Value and Policy Iterations
Class Notes
  • Reinforcement Learning and Control (Sec 1-2) [pdf]
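Value iteration, introduced in this lecture, fits in a few lines on a toy MDP (the two-state rewards and transitions below are invented for illustration):

```python
import numpy as np

# Toy 2-state MDP: state 1 pays reward 1, state 0 pays nothing.
gamma = 0.9
R = np.array([0.0, 1.0])              # reward for being in each state
P = np.array([
    [[1.0, 0.0], [0.0, 1.0]],        # action 0: "stay"
    [[0.0, 1.0], [1.0, 0.0]],        # action 1: "switch"
])                                    # P[a, s, s'] = transition probability

V = np.zeros(2)
for _ in range(300):
    # Bellman optimality backup: V(s) = R(s) + gamma * max_a sum_s' P(s'|s,a) V(s')
    V = R + gamma * (P @ V).max(axis=0)
print(V)                              # converges to [9, 10]
```

The optimal policy here is to switch out of state 0 and stay in state 1, giving V*(1) = 1/(1 - gamma) = 10 and V*(0) = gamma * V*(1) = 9; each backup shrinks the error by a factor of gamma, which is why the loop converges.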
Lecture 15 7/26
  • RL (wrap-up)
  • Learning MDP model
  • Continuous States
Class Notes
  • Reinforcement Learning and Control (Sec 3-4) [pdf]
Lecture 16 7/29 Unsupervised Learning
  • K-means clustering
  • Mixture of Gaussians (GMM)
  • Expectation Maximization (EM)
Class Notes
  • K-means [pdf]
  • Mixture of Gaussians [pdf]
  • Expectation Maximization (Sec 1-2, skip 2.1) [pdf]
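The K-means loop from this lecture alternates two steps, sketched here on synthetic two-blob data (the data, k, and the seed points used for initialization are all made up for a clean demo):

```python
import numpy as np

# Two well-separated Gaussian blobs around (0, 0) and (5, 5).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (20, 2)),
               rng.normal(5, 0.1, (20, 2))])

k = 2
centroids = X[[0, 20]].copy()    # one seed point from each blob (known here for the demo)
for _ in range(20):
    # Assignment step: each point joins its nearest centroid.
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    # Update step: each centroid moves to the mean of its assigned points.
    centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
print(centroids)                 # near (0, 0) and (5, 5)
```

In practice initialization matters (random restarts or k-means++ are common); the deterministic seeds above sidestep that issue for the sake of a short example.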
Lecture 17 7/31
  • EM (wrap-up)
  • Factor Analysis
Class Notes
  • Expectation Maximization (Sec 3) [pdf]
  • Factor Analysis [pdf]
Lecture 18 8/2
  • Factor Analysis (wrap-up)
  • Principal Components Analysis (PCA)
  • Independent Components Analysis (ICA)
Class Notes
  • Principal Components Analysis [pdf]
  • Independent Components Analysis [pdf]
Lecture 19 8/5
  • Maximum Entropy and Exponential Family
  • KL-Divergence
  • Calibration and Proper Scoring Rules
Class Notes
  • Maximum Entropy [pdf]
Lecture 20 8/7
  • Variational Inference
  • EM Variants
  • Variational Autoencoder
Class Notes
  • VAE (Sec 4) [pdf]
Lecture 21 8/9
  • Evaluation Metrics
Class Notes
  • Evaluation Metrics [pptx]
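The core classification metrics from this lecture reduce to a few ratios over confusion-matrix counts (the counts below are toy numbers for illustration):

```python
# Confusion-matrix counts for a binary classifier (toy values).
tp, fp, fn, tn = 8, 2, 4, 86

precision = tp / (tp + fp)                            # 8/10 = 0.8
recall = tp / (tp + fn)                               # 8/12 ~= 0.667
f1 = 2 * precision * recall / (precision + recall)    # harmonic mean
accuracy = (tp + tn) / (tp + fp + fn + tn)
print(precision, recall, f1, accuracy)
```

Note how accuracy (0.94) looks strong here mainly because negatives dominate; precision and recall expose the classifier's real behavior on the rare positive class.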
Lecture 22 8/12
  • Practical advice and tips
  • Review for Finals
Class Notes
Lecture 23 8/14
  • Review for Finals
Class Notes
Final 8/16
Other Resources
  1. Advice on applying machine learning: Slides from Andrew's lecture on getting machine learning algorithms to work in practice can be found here.
  2. Previous projects: A list of last year's final projects can be found here.
  3. Data: Here is the UCI Machine Learning Repository, which contains a large collection of standard datasets for testing learning algorithms. If you want to see examples of recent work in machine learning, start by taking a look at the conferences NeurIPS (all old NeurIPS papers are online) and ICML. Some other related conferences include UAI, AAAI, IJCAI.
  4. Viewing PostScript and PDF files: Depending on the computer you are using, you may need to download a PostScript or PDF viewer if you don't already have one.
  5. Machine learning study guides tailored to CS 229 by Afshine Amidi and Shervine Amidi.