CS 274A: Syllabus, Winter 2017
Note: this syllabus may be updated during the quarter
- Week 1: January 9th
- Probability Review: random variables, conditional and
joint probabilities, Bayes rule, law of total probability,
chain rule and factorization. Sets of random variables, the
multivariate Gaussian model. Conditional independence and graphical
models.
- Week 2: January 16th
- No lecture on Monday (university holiday)
- Learning from Data: Concepts of models and parameters. Definition of the
likelihood function and the principle of maximum likelihood parameter estimation.
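The likelihood and maximum-likelihood ideas above can be illustrated with a minimal Python sketch (an illustrative example, not course code; the function names and toy data are assumptions):

```python
import math
import random

def gaussian_log_likelihood(data, mu, sigma2):
    """Log-likelihood of i.i.d. data under a Gaussian N(mu, sigma2)."""
    n = len(data)
    return (-0.5 * n * math.log(2 * math.pi * sigma2)
            - sum((x - mu) ** 2 for x in data) / (2 * sigma2))

def gaussian_mle(data):
    """Closed-form maximum likelihood estimates: sample mean and variance."""
    n = len(data)
    mu_hat = sum(data) / n
    sigma2_hat = sum((x - mu_hat) ** 2 for x in data) / n
    return mu_hat, sigma2_hat

random.seed(0)
data = [random.gauss(1.0, 2.0) for _ in range(1000)]
mu_hat, sigma2_hat = gaussian_mle(data)
# The MLE maximizes the log-likelihood over (mu, sigma2):
assert gaussian_log_likelihood(data, mu_hat, sigma2_hat) >= \
       gaussian_log_likelihood(data, 0.0, 1.0)
```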
- Week 3: January 23rd
- Maximum Likelihood Learning: Using maximum likelihood methods to learn the
parameters of Gaussian, binomial, multinomial, and other parametric models.
- Bayesian Learning: Frequentist and Bayesian views of probability.
General principles of Bayesian estimation: prior densities, posterior densities, MAP estimation, and fully Bayesian approaches.
- Week 4: January 30th
- Bayesian Learning: Dirichlet/multinomial and Gaussian examples. Predictive densities, model selection, model averaging.
- Sequence Models: Learning with Markov models.
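As a concrete illustration of the Dirichlet/multinomial conjugacy mentioned above (a hypothetical sketch; the helper names are assumptions, not course material):

```python
def dirichlet_posterior(alpha_prior, counts):
    """Conjugate update: a Dirichlet(alpha) prior combined with multinomial
    counts yields a Dirichlet(alpha + counts) posterior."""
    return [a + c for a, c in zip(alpha_prior, counts)]

def posterior_mean(alpha):
    """Mean of a Dirichlet(alpha) density: alpha_k / sum(alpha)."""
    total = sum(alpha)
    return [a / total for a in alpha]

# Uniform Dirichlet(1, 1, 1) prior; observe counts for three categories.
alpha_post = dirichlet_posterior([1.0, 1.0, 1.0], [2, 5, 3])
print(posterior_mean(alpha_post))  # [3/13, 6/13, 4/13]
```

The posterior mean here is also the predictive probability of each category under the posterior, one of the quantities discussed this week.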
- Week 5: February 6th
- Regression Learning: Linear models. Probabilistic perspectives on regression.
Loss functions. Parameter estimation methods for regression.
- Midterm Exam during Wednesday's class (in-class, closed-book)
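The regression topics for this week can be sketched with a tiny example (assumed illustration: under i.i.d. Gaussian noise on y, the maximum likelihood fit coincides with least squares):

```python
def fit_linear(xs, ys):
    """Least-squares fit of y = w*x + b; equivalently the maximum
    likelihood solution under additive Gaussian noise on y."""
    n = len(xs)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    sxy = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    sxx = sum((x - x_mean) ** 2 for x in xs)
    w = sxy / sxx
    b = y_mean - w * x_mean
    return w, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # exactly y = 2x + 1
w, b = fit_linear(xs, ys)   # recovers w = 2.0, b = 1.0
```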
- Week 6: February 13th
- Bias-Variance Trade-offs: The bias-variance trade-off for squared error and regression.
- Predictive modeling: Optimization algorithms, focusing on gradient
and stochastic gradient methods. Classification models and logistic regression.
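The stochastic gradient and logistic regression topics above can be combined in one minimal sketch (a toy illustration under assumed data; not course code):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sgd_logistic(data, lr=0.1, epochs=200):
    """Stochastic gradient ascent on the logistic regression
    log-likelihood, for 1-D inputs with a bias term."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:           # labels y in {0, 1}
            p = sigmoid(w * x + b)
            w += lr * (y - p) * x   # per-example log-likelihood gradient
            b += lr * (y - p)
    return w, b

# Linearly separable toy data: negatives below 0, positives above.
data = [(-2.0, 0), (-1.0, 0), (1.0, 1), (2.0, 1)]
w, b = sgd_logistic(data)
assert sigmoid(w * 2.0 + b) > 0.5 > sigmoid(w * -2.0 + b)
```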
- Week 7: February 20th
- No lecture on Monday (university holiday)
- Classification Learning: Likelihood-based approaches and properties of objective functions.
Links between logistic regression and neural network models.
- Week 8: February 27th
- Classification Learning: Bayes rule, classification boundaries, discriminant functions,
optimal decisions, Bayes error rate, Gaussian classifiers.
- Temporal Models: Autoregressive models, recurrent neural networks.
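The Bayes-rule classification material above can be illustrated with a two-class Gaussian example (an assumed sketch with shared variance; the function names are not from the course):

```python
import math

def gaussian_pdf(x, mu, sigma2):
    return math.exp(-(x - mu) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

def posterior_class1(x, mu0, mu1, sigma2, prior1=0.5):
    """Bayes rule: P(c=1 | x) for two Gaussian classes with shared variance."""
    p1 = prior1 * gaussian_pdf(x, mu1, sigma2)
    p0 = (1 - prior1) * gaussian_pdf(x, mu0, sigma2)
    return p1 / (p0 + p1)

# With equal priors and equal variances, the optimal decision boundary
# is the midpoint of the two means (here, x = 0).
assert abs(posterior_class1(0.0, mu0=-1.0, mu1=1.0, sigma2=1.0) - 0.5) < 1e-12
assert posterior_class1(2.0, mu0=-1.0, mu1=1.0, sigma2=1.0) > 0.5
```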
- Week 9: March 6th
- Mixture Models and EM: Mixture models. Examples of mixture models for binary
and real-valued data. The EM algorithm for learning Gaussian mixtures.
- Mixture Models and EM: Properties of the EM algorithm. Relation of K-means clustering to
Gaussian mixture modeling. Mixtures for non-vector data.
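A stripped-down EM sketch for the Gaussian mixture topic above (an assumed illustration: two 1-D components with fixed equal weights and shared variance, learning only the means):

```python
import math
import random

def em_gmm_1d(data, mus, sigma2=1.0, iters=50):
    """EM for a two-component 1-D Gaussian mixture with fixed, equal
    mixing weights and shared variance; only the means are updated."""
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point.
        r = []
        for x in data:
            p0 = math.exp(-(x - mus[0]) ** 2 / (2 * sigma2))
            p1 = math.exp(-(x - mus[1]) ** 2 / (2 * sigma2))
            r.append(p1 / (p0 + p1))
        # M-step: responsibility-weighted mean updates.
        mus[0] = sum((1 - ri) * x for ri, x in zip(r, data)) / sum(1 - ri for ri in r)
        mus[1] = sum(ri * x for ri, x in zip(r, data)) / sum(r)
    return mus

random.seed(1)
data = [random.gauss(-3.0, 1.0) for _ in range(200)] + \
       [random.gauss(3.0, 1.0) for _ in range(200)]
mus = em_gmm_1d(data, [-1.0, 1.0])
# The estimated means move toward the true cluster centers at -3 and 3.
```

This also shows the K-means connection noted above: as sigma2 shrinks toward zero, the responsibilities become hard 0/1 assignments and the M-step reduces to the K-means centroid update.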
- Week 10: March 13th
- Monday: Additional topics in unsupervised learning
- Wednesday: no lecture
- Finals Week:
- Final exam, in class, Wednesday March 22nd, 10:30am to 12:30pm