Syllabus and Course Schedule

Time and Location: Monday, Wednesday 4:30-5:50pm, Bishop Auditorium
Class Videos: Current quarter's class videos are available here for SCPD students and here for non-SCPD students.


Event | Date | Description | Materials and Assignments
Lecture 1 9/24 Introduction and Basic Concepts
A0 9/24 Problem Set 0 [pdf]. Out 9/24. Due 10/3. Submission instructions.
Lecture 2 9/26 Supervised Learning Setup. Linear Regression. Class Notes
  • Supervised Learning, Discriminative Algorithms [ps] [pdf]
Section 9/28 Discussion Section: Linear Algebra [Notes]
Lecture 3 10/1 Weighted Least Squares. Logistic Regression. Newton's Method.
Lecture 4 10/3 Perceptron. Exponential Family. Generalized Linear Models. Class Notes
  • Generative Algorithms [ps] [pdf]
A1 10/3 Problem Set 1 [zip]. Out 10/3. Due 10/17. Submission instructions.
Section 10/5 Discussion Section: Probability [Notes] [Slides]
Lecture 5 10/8 Gaussian Discriminant Analysis. Naive Bayes.
Lecture 6 10/10 Laplace Smoothing. Support Vector Machines.
Class Notes
  • Support Vector Machines [ps] [pdf]
Section 10/12 Discussion Section: Python [slides]
Lecture 7 10/15 Support Vector Machines. Kernels.
Lecture 8 10/17 Bias-Variance tradeoff. Regularization and model/feature selection. Class Notes
  • Bias/variance tradeoff and error analysis [pdf]
  • Regularization and Model Selection [ps] [pdf]
  • Advice on applying machine learning [pdf]
A2 10/17 Problem Set 2 [zip]. Out 10/17. Due 10/31. Submission instructions.
Section 10/19 Discussion Section: Learning Theory [ps] [pdf]
Project 10/19 Project proposal due at 11:59pm.
Lecture 9 10/22 Tree Ensembles. Class Notes
  • Decision trees [pdf]
  • Ensembling methods [pdf]
Lecture 10 10/24 Neural Networks: Basics
Class Notes
  • Online Learning and the Perceptron Algorithm. (optional reading) [ps] [pdf]
  • Deep learning [pdf]
  • Backpropagation [pdf]
Section 10/26 Discussion Section: Evaluation Metrics [Slides]
Lecture 11 10/29 Neural Networks: Training
Lecture 12 10/31 Practical Advice for ML projects. Class Notes
  • Unsupervised Learning, k-means clustering. [ps] [pdf]
  • Mixture of Gaussians [ps] [pdf]
  • The EM Algorithm [ps] [pdf]
  • Factor Analysis [ps] [pdf]
  • Principal Components Analysis [ps] [pdf]
  • Independent Components Analysis [ps] [pdf]
A3 10/31 Problem Set 3 [zip]. Out 10/31. Due 11/14. Submission instructions.
Section 11/2 Discussion Section: Midterm Review [pdf]
Lecture 13 11/5 K-means. Mixture of Gaussians. Expectation Maximization.
Lecture 14 11/7 Factor Analysis.
Midterm 11/7 We will have a take-home midterm. All details are posted on Piazza.
Lecture 15 11/12 Principal Component Analysis. Independent Component Analysis.
Lecture 16 11/14 MDPs. Bellman Equations.
A4 11/14 Problem Set 4 [zip]. Out 11/14. Due 12/5. Submission instructions.
Section 11/16 Discussion Section: canceled
Project 11/16 Project milestones due 11/16 at 11:59pm.
Lecture 17 11/26 Value Iteration and Policy Iteration. LQR. LQG. Class Notes
  • Reinforcement Learning and Control [ps] [pdf]
  • LQR, DDP and LQG [pdf]
Lecture 18 11/28 Q-Learning. Value function approximation.
Section 11/30 Discussion Section: On critiques of Machine Learning [slides]
Lecture 19 12/3 Policy Search. REINFORCE. POMDPs.
Lecture 20 12/5 Optional topic. Wrap-up.
Section 12/07 Discussion Section: Convolutional Neural Networks
Project 12/10 Project poster PDF and project recording (some teams) due at 11:59pm. Submission instructions.
Project 12/11 Poster presentations from 8:30-11:30am. Venue and details to be announced.
Project 12/13 Final writeup due at 11:59pm (no late days).
Supplementary Notes
  1. Binary classification with +/-1 labels [pdf]
  2. Boosting algorithms and weak learning [pdf]
  3. The functional obtained after implementing stump_booster.m in PS2 [here]
  4. The representer theorem [pdf]
  5. Hoeffding's inequality [pdf]
Section Notes
  1. Linear Algebra Review and Reference [pdf]
  2. Probability Theory Review [pdf]
  3. Convex Optimization Overview, Part I [ps] [pdf]
  4. Convex Optimization Overview, Part II [ps] [pdf]
  5. Hidden Markov Models [ps] [pdf]
  6. The Multivariate Gaussian Distribution [pdf]
  7. More on Gaussian Distribution [pdf]
  8. Gaussian Processes [pdf]
Other Resources
  1. Advice on applying machine learning: Slides from Andrew's lecture on getting machine learning algorithms to work in practice can be found here.
  2. Previous projects: A list of last year's final projects can be found here.
  3. Data: Here is the UCI Machine Learning Repository, which contains a large collection of standard datasets for testing learning algorithms. If you want to see examples of recent work in machine learning, start by taking a look at the conferences NIPS (all old NIPS papers are online) and ICML. Some other related conferences include UAI, AAAI, and IJCAI.
  4. Viewing PostScript and PDF files: Depending on the computer you are using, you may be able to download a PostScript viewer or PDF viewer for it if you don't already have one.
  5. Machine learning study guides tailored to CS 229 by Afshine Amidi and Shervine Amidi.