CS 274A: Syllabus and Schedule, Winter 2024
Note: dates and topics may change slightly during the quarter, but the overall syllabus should remain largely the same.
- Week 1: January 8th
- Probability Review: random variables, conditional and joint probabilities, Bayes rule, the law of total probability, factorization. Sets of random variables, the multivariate Gaussian model. Conditional independence and graphical models.
- Week 2: January 15th
- No lecture on Monday (university holiday)
- Learning from Data: Concepts of models and parameters. Definition of the likelihood function and the principle of maximum likelihood.
- Week 3: January 22nd
- Maximum Likelihood Learning: Maximum likelihood estimation for Gaussian, binomial, multivariate, and other parametric models.
- Sequence Models: Learning from sequential data. Markov models and related approaches. Connections with language models.
- Week 4: January 29th
- Bayesian Learning: General principles of Bayesian estimation: prior densities, posterior densities, and Beta-binomial examples.
- Bayesian Learning: Comparing point estimates (ML, MAP, MPE) and fully Bayesian approaches. Bayesian analysis of multinomial models and Markov models. Bayesian approaches to multi-armed bandits (in homework).
- Week 5: February 5th
- Bayesian Learning: Bayesian analysis of Gaussian models. Predictive densities. Bayesian model selection.
- Bayesian Learning: Predictive densities for Gaussian models. Approximate Bayesian inference: Laplace, variational, and Monte Carlo methods.
- Week 6: February 12th
- Midterm Exam during Monday's class
- Regression Learning: Linear and non-linear (e.g., neural network) models. Probabilistic perspectives on regression. Loss functions. Parameter estimation methods for regression.
- Week 7: February 19th
- No lecture on Monday (university holiday)
- Regression Learning: Bayesian approaches to regression. The bias-variance trade-off for squared-error loss in regression.
- Week 8: February 26th
- Classification Learning: Likelihood-based approaches and properties of objective functions. Connections between regression and classification. Logistic regression and neural network classifiers.
- Classification Learning: Decision boundaries, discriminant functions, optimal decisions, and the Bayes error rate.
- Week 9: March 4th
- Mixture Models and EM: Finite mixture models. The EM algorithm for learning Gaussian mixtures.
- Mixture Models and EM: Properties of the EM algorithm. The relation of K-means clustering to Gaussian mixture modeling. Mixtures for discrete and non-vector data.
- Week 10: March 11th
- Monday: Latent Variable Models: Bayesian learning approaches. MCMC methods.
- Wednesday: Temporal Models: Autoregressive models, recurrent neural networks.
- Finals Week: March 18th
- Final exam, Wed March 20th, 10:30am