Syllabus and Course Schedule

Time and Location: Monday, Wednesday 9:30-10:50am, NVIDIA Auditorium
Class Videos: Current quarter's class videos are available here for SCPD students and here for non-SCPD students.


Event | Date | Description | Materials and Assignments
Introduction (1 class)
Lecture 1 9/25 1. Basic concepts Class Notes
  • Supervised Learning, Discriminative Algorithms [ps] [pdf]
A0 9/25 Problem Set 0 [pdf]. Submission instructions.
Supervised learning (5 classes)
Lecture 2 9/27 1. Supervised learning setup. LMS.
Section 9/29 Discussion Section: Linear Algebra [Notes]
Lecture 3 10/2 2. Logistic regression. Perceptron. Exponential family.
Lecture 4 10/4
A1 10/4 Problem Set 1 [pdf]. Out 10/4. Due 10/18. Submission instructions.
Section 10/6 Discussion Section: Probability [Notes] [Slides]
Lecture 5 10/9 3. Generative learning algorithms. Gaussian discriminant analysis. Naive Bayes. Class Notes
  • Generative Algorithms [ps] [pdf]
Lecture 6 10/11 4. Support vector machines. Class Notes
  • Support Vector Machines [ps] [pdf]
Section 10/13 Discussion Section: Vectorization [Slides] [kNN] [Logistic Regression] [Softmax Regression] [images] [labels]
Practical ML advice (2 classes)
Lecture 7 10/16 1. Bias/variance tradeoff
2. Model selection and feature selection
Class Notes
  • Bias/variance tradeoff and error analysis [pdf]
  • Learning Theory [ps] [pdf]
  • Regularization and Model Selection [ps] [pdf]
  • Online Learning and the Perceptron Algorithm. (optional reading) [ps] [pdf]
  • Advice on applying machine learning [pdf]
Lecture 8 10/18 3. Evaluating and debugging learning algorithms
4. Practical advice on structuring an ML project
A2 10/18 Problem Set 2 [pdf]. Out 10/18. Due 11/1. Submission instructions.
Section 10/20 Discussion Section: Convex Optimization
  • Convex Optimization Overview, Part I [ps] [pdf]
  • Convex Optimization Overview, Part II [ps] [pdf]
Project 10/20 Project proposal due at 11:59pm.
Deep Learning (2 classes)
Lecture 9 10/23 1. NN architecture
2. Forward/Back propagation
Class Notes
  • Deep learning [pdf]
  • Backpropagation [pdf]
Lecture 10 10/25 3. Vectorization
4. Other optimization tricks.
Section 10/27 Discussion Section: Evaluation Metrics [Slides]
Unsupervised learning (5 classes)
Lecture 11 10/30 1. Clustering. K-means.
2. EM. Mixture of Gaussians.
3. Factor analysis.
4. PCA (Principal components analysis).
5. ICA (Independent components analysis).
Class Notes
  • Unsupervised Learning, k-means clustering. [ps] [pdf]
  • Mixture of Gaussians [ps] [pdf]
  • The EM Algorithm [ps] [pdf]
  • Factor Analysis [ps] [pdf]
  • Principal Components Analysis [ps] [pdf]
  • Independent Components Analysis [ps] [pdf]
Lecture 12 11/1
Lecture 13 11/6
Lecture 14 11/8
Lecture 15 11/13
Section 11/3 Discussion Section: Midterm Review
A3 11/1 Problem Set 3 [pdf]. Out 11/1. Due 11/15. Submission instructions.
Midterm 11/8 The midterm is open-book/open-notes/open laptop (no internet). It will take place on Wednesday, November 8, 2017 from 6-9 PM. The course staff will announce exam venue and material covered closer to the midterm date.
Section 11/17 Discussion Section: Deep Learning Methods
Project 11/20 Project milestones due 11/20 at 11:59pm.
Reinforcement learning and control (4 classes)
Lecture 16 11/15 1. MDPs. Bellman equations.
2. Value iteration and policy iteration.
3. Linear quadratic regulation (LQR). LQG.
4. Q-learning. Value function approximation.
Class Notes
  • Reinforcement Learning and Control [ps] [pdf]
  • LQR, DDP and LQG [pdf]
Lecture 17 11/27
Lecture 18 11/29
A4 11/15 Problem Set 4 [pdf]. Out 11/15. Due 12/6. Submission instructions.
Section 12/1 Discussion Section: Deep Learning Platform
Lecture 19 12/4 Generative Adversarial Networks (GANs) Class Notes
  • Generative Adversarial Networks (GANs) [pdf]
Lecture 20 12/6 Adversarial machine learning Class Notes
  • Adversarial examples in ML [pdf]
Project 12/11 Project poster PDF and project recording (some teams) due at 11:59pm. Submission instructions.
Project 12/12 Poster presentations from 8:30-11:30am. Venue and details to be announced.
Project 12/15 Final writeup due at 11:59pm (no late days).
Supplementary Notes
  1. Binary classification with +/-1 labels [pdf]
  2. Boosting algorithms and weak learning [pdf]
  3. The functional obtained after implementing stump_booster.m in PS2 [here]
  4. The representer theorem [pdf]
  5. Hoeffding's inequality [pdf]
Section Notes
  1. Linear Algebra Review and Reference [pdf]
  2. Probability Theory Review [pdf]
  3. Files for the Matlab tutorial: [pdf] [sigmoid.m] [logistic_grad_ascent.m] [matlab_session.m]
  4. Convex Optimization Overview, Part I [ps] [pdf]
  5. Convex Optimization Overview, Part II [ps] [pdf]
  6. Hidden Markov Models [ps] [pdf]
  7. The Multivariate Gaussian Distribution [pdf]
  8. More on Gaussian Distribution [pdf]
  9. Gaussian Processes [pdf]
Other Resources
  1. Advice on applying machine learning: Slides from Andrew's lecture on getting machine learning algorithms to work in practice can be found here.
  2. Previous projects: A list of last year's final projects can be found here.
  3. Matlab resources: Here are a couple of Matlab tutorials that you might find helpful: http://www.math.ucsd.edu/~bdriver/21d-s99/matlab-primer.html and http://www.math.mtu.edu/~msgocken/intro/node1.html. For emacs users only: If you plan to run Matlab in emacs, here are matlab.el and a helpful .emacs file.
  4. Octave resources: For a free alternative to Matlab, check out GNU Octave. The official documentation is available here. Some useful tutorials on Octave include http://en.wikibooks.org/wiki/Octave_Programming_Tutorial and http://www-mdp.eng.cam.ac.uk/web/CD/engapps/octave/octavetut.pdf .
  5. Data: Here is the UCI Machine Learning Repository, which contains a large collection of standard datasets for testing learning algorithms. If you want to see examples of recent work in machine learning, start by taking a look at the conferences NIPS (all old NIPS papers are online) and ICML. Some other related conferences include UAI, AAAI, and IJCAI.
  6. Viewing PostScript and PDF files: Depending on the computer you are using, you may be able to download a PostScript viewer or PDF viewer for it if you don't already have one.