CS 274A: Syllabus, Winter 2018
Note: the schedule may be adapted/updated during the quarter.
- Week 1: January 8th
- Probability Review: random variables, conditional and
joint probabilities, Bayes rule, law of total probability,
chain rule and factorization. Sets of random variables, the
multivariate Gaussian model. Conditional independence and graphical
models.
- Week 2: January 15th
- No lecture on Monday (university holiday)
- Learning from Data: Concepts of models and parameters. Definition of the
likelihood function and the principle of maximum likelihood parameter estimation.
- Week 3: January 22nd
- Maximum Likelihood Learning: Using maximum likelihood methods to learn the
parameters of Gaussian, binomial, multivariate, and other parametric models.
- Bayesian Learning: Frequentist and Bayesian views of probability.
General principles of Bayesian estimation: prior densities, posterior densities, MAP, fully Bayesian approaches.
- Week 4: January 29th
- Bayesian Learning: Dirichlet/multinomial and Gaussian examples. Predictive densities, model selection, model averaging.
- Sequence Models: Learning from sequential data. Markov models and related approaches.
- Week 5: February 5th
- Regression Learning I: Linear models. Probabilistic perspectives on regression.
Loss functions. Parameter estimation methods for regression.
- Regression Learning II: Optimization algorithms, focusing on gradient
and stochastic gradient methods. Regularization and Bayesian methods.
Connections between regression and classification.
- Week 6: February 12th
- Midterm Exam during Monday's class (in-class, closed-book)
- Bias-Variance Trade-offs: The bias-variance trade-off for squared error and regression.
- Week 7: February 19th
- No lecture on Monday (university holiday)
- Classification Learning: Likelihood-based approaches and properties of objective functions.
Links between logistic regression and neural network models.
- Week 8: February 26th
- Classification Learning: Bayes rule, classification boundaries, discriminant functions,
optimal decisions, Bayes error rate, Gaussian classifiers.
- Temporal Models: Autoregressive models, recurrent neural networks.
- Week 9: March 5th
- Mixture Models and EM: Mixture models. Examples of mixture models for binary
and real-valued data.
The EM algorithm for learning Gaussian mixtures.
- Mixture Models and EM: Properties of the EM algorithm. Relation of K-means clustering to
Gaussian mixture modeling. Mixtures for non-vector data.
- Week 10: March 12th
- Monday: Additional topics in unsupervised learning
- Wednesday: Additional topics in sequential models
- Finals Week:
- Final exam, in class, Friday, March 23rd, 8:00am to 10:00am.
The time of the exam was selected by the registrar, not by the instructor :)