| Week | Monday | Wednesday | Reading |
| --- | --- | --- | --- |
| week 1: Jan 7th | Introduction. Review of Probability: random variables, conditional and joint probabilities, Bayes rule, law of total probability, the chain rule and factorization. Different interpretations of probability: frequentist and Bayesian views (see the Bayes rule sketch after the schedule). | Multivariate Probability Models: Working with sets of random variables. The multivariate Gaussian model. Independence, conditional independence, and graphical models. | Note Sets 1 and 2. Chapters 1 and 2.1 through 2.6 in text |
| week 2: Jan 14th | Learning from Data: Concepts of models and parameters. Definition of the likelihood function and the principle of maximum likelihood parameter estimation. | Maximum Likelihood Learning I: How to use maximum likelihood methods to learn the parameters of Gaussian, binomial, multivariate, and other parametric models (see the Gaussian MLE sketch after the schedule). | Note Set 3. Chapter 10 in text, sections 10.1 through 10.3 |
| week 3: Jan 21st | No Class: University Holiday (Martin Luther King Day) | Maximum Likelihood Learning II | Chapter 4 in text, section 4.1. Optional Reading: Tutorial paper on maximum likelihood estimation |
| week 4: Jan 28th | Bayesian Learning I: General principles of Bayesian estimation: prior densities, posterior densities, MAP, and fully Bayesian approaches. The Beta/binomial model (see the Beta/binomial sketch after the schedule). | Bayesian Learning II: Bayesian estimation (continued): estimation of Gaussian parameters. | Chapter 3.3/3.4 on Beta/Binomial and Dirichlet/Multinomial. Chapter 5.2 on posterior distributions. (Optional) Chapter 4.6 on estimating parameters of multivariate Gaussians |
| week 5: Feb 4th | Bayesian Learning III: Predictive densities, model selection, model averaging. | Midterm exam (in class) | Chapter 3.2 on predictive densities and Chapter 5.3 on Bayesian model selection. |
| week 6: Feb 11th | Classification I | Classification II | Chapter 3 in text, section 3.5. Chapter 4 in text, section 4.2. Optional Reading: naive Bayes models in spam email filtering |
| week 7: Feb 18th | No Class: University Holiday (Presidents' Day) | Classification III: Class-conditional modeling. Likelihood-based approaches and properties of objective functions. Logistic regression and neural network models (see the logistic regression sketch after the schedule). | Barber text: pages 376 to 382 on logistic regression. Murphy text: pages 246 to 271. Notes on logistic regression from Charles Elkan. Logistic regression for high-dimensional text data |
| week 8: Feb 25th | Mixture Models and EM I: K-means clustering. Mixtures of Gaussians and the associated EM algorithm. Clustering applications. Mixtures of conditional independence models (see the EM sketch after the schedule). | Mixture Models/EM II | Note Set 4. Chapter 11 in text, sections 11.1 through 11.4. Optional Reading: |
| week 9: Mar 4th | Regression Modeling I: Linear models. Normal equations. Systematic and stochastic components (see the normal-equations sketch after the schedule). | Regression Modeling II | Chapter 7 in text, sections 7.1 through 7.3 and section 7.6. Optional Reading: |
| week 10: Mar 11th | State-space Models: hidden Markov and linear Gaussian models | Monte Carlo Methods: Importance sampling. Gibbs sampling and Markov Chain Monte Carlo (MCMC). Sequential sampling (see the Gibbs sampling sketch after the schedule). | Chapter 17 in text, sections 17.2 through 17.5. Chapter 23, and sections 24.1 and 24.2 |
| finals week | | | |
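The sketches below are supplementary illustrations keyed to the weekly topics above. They are not part of the course notes or the assigned readings; all data, parameter values, and variable names in them are hypothetical. First, for week 1, a minimal worked example of Bayes rule and the law of total probability for a made-up binary test:

```python
# Illustrative sketch (hypothetical numbers, not from the course notes):
# Bayes' rule and the law of total probability for a binary disease/test example.
p_disease = 0.01                      # prior P(D)
p_pos_given_disease = 0.95            # likelihood P(+ | D)
p_pos_given_healthy = 0.05            # false-positive rate P(+ | not D)

# Law of total probability: P(+) = P(+ | D) P(D) + P(+ | not D) P(not D)
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

# Bayes' rule: P(D | +) = P(+ | D) P(D) / P(+)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(f"P(disease | positive test) = {p_disease_given_pos:.3f}")
```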
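For week 2, a minimal sketch of maximum likelihood estimation for a univariate Gaussian using NumPy; the synthetic data and the true mean and standard deviation are assumptions made purely for illustration:

```python
# Illustrative sketch: MLE for a univariate Gaussian. The MLE of the mean is
# the sample mean; the MLE of the variance is the average squared deviation
# (note: it divides by N, not N - 1, so it is the biased estimator).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=1000)   # synthetic data (assumed mu=2, sigma=1.5)

mu_mle = x.mean()
var_mle = ((x - mu_mle) ** 2).mean()            # divides by N

print(f"MLE mean: {mu_mle:.3f}, MLE variance: {var_mle:.3f}")
```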
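For week 4, a sketch of conjugate Bayesian updating in the Beta/binomial model, comparing the maximum likelihood estimate, the posterior mean, and the MAP estimate; the prior hyperparameters and the observed counts are hypothetical:

```python
# Illustrative sketch: Bayesian updating of a binomial success probability
# with a Beta(a, b) prior. Conjugacy gives a Beta(a + heads, b + tails) posterior.
a, b = 2.0, 2.0          # assumed prior hyperparameters (not from the course)
heads, tails = 7, 3      # hypothetical observed counts

a_post, b_post = a + heads, b + tails
mle = heads / (heads + tails)                       # maximum likelihood estimate
post_mean = a_post / (a_post + b_post)              # posterior mean
map_est = (a_post - 1) / (a_post + b_post - 2)      # MAP (mode of the Beta posterior)

print(f"MLE: {mle:.3f}, posterior mean: {post_mean:.3f}, MAP: {map_est:.3f}")
```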
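For week 7, a sketch of fitting a logistic regression model by gradient ascent on the average log-likelihood; the synthetic data, learning rate, and iteration count are arbitrary choices, not taken from the readings:

```python
# Illustrative sketch: maximum-likelihood fitting of logistic regression by
# batch gradient ascent. The gradient of the average log-likelihood is
# X^T (y - p) / n, where p is the vector of predicted probabilities.
import numpy as np

rng = np.random.default_rng(3)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])    # intercept + one feature
true_w = np.array([-0.5, 2.0])                           # assumed true weights
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-X @ true_w))).astype(float)

w = np.zeros(2)
lr = 0.1
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))          # predicted P(y = 1 | x)
    grad = X.T @ (y - p) / n              # gradient of the average log-likelihood
    w += lr * grad                        # ascent step

print("estimated weights:", w)
```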
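For week 8, a sketch of the EM algorithm for a two-component, one-dimensional mixture of Gaussians; the initialization and number of iterations are arbitrary, and the data are synthetic:

```python
# Illustrative sketch: EM for a 2-component 1-D Gaussian mixture.
# E-step: compute responsibilities. M-step: re-estimate weights, means, variances.
import numpy as np

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 200)])  # synthetic data

w = np.array([0.5, 0.5])        # mixing weights
mu = np.array([-1.0, 1.0])      # initial means (arbitrary)
var = np.array([1.0, 1.0])      # initial variances

def normal_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: responsibility of each component for each point, shape (N, 2)
    dens = w * normal_pdf(x[:, None], mu, var)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: weighted re-estimation of the parameters
    n_k = resp.sum(axis=0)
    w = n_k / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / n_k
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / n_k

print("weights:", w, "means:", mu, "variances:", var)
```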
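For week 9, a sketch of fitting a linear model by least squares on synthetic data. The normal equations (X^T X) w = X^T y characterize the solution; here the equivalent least-squares problem is solved with numpy.linalg.lstsq rather than an explicit matrix inverse, for numerical stability:

```python
# Illustrative sketch: least-squares fit of a linear model with an intercept.
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = rng.uniform(-3, 3, size=n)
y = 1.5 * x + 0.5 + rng.normal(0, 0.3, size=n)   # assumed true slope 1.5, intercept 0.5

X = np.column_stack([np.ones(n), x])             # design matrix with intercept column
w, *_ = np.linalg.lstsq(X, y, rcond=None)        # least-squares solution of X w ~= y

print(f"intercept: {w[0]:.3f}, slope: {w[1]:.3f}")
```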
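For week 10, a sketch of Gibbs sampling from a bivariate Gaussian with zero means, unit variances, and correlation rho, alternating draws from the exact full conditionals; rho, the burn-in, and the number of samples are arbitrary choices:

```python
# Illustrative sketch: Gibbs sampler for a bivariate Gaussian using the full
# conditionals x1 | x2 ~ N(rho * x2, 1 - rho^2) and symmetrically for x2 | x1.
import numpy as np

rng = np.random.default_rng(4)
rho = 0.8
n_samples, burn_in = 5000, 500
x1, x2 = 0.0, 0.0
samples = []

for t in range(n_samples + burn_in):
    x1 = rng.normal(rho * x2, np.sqrt(1 - rho ** 2))   # sample x1 given x2
    x2 = rng.normal(rho * x1, np.sqrt(1 - rho ** 2))   # sample x2 given x1
    if t >= burn_in:                                    # discard burn-in draws
        samples.append((x1, x2))

samples = np.array(samples)
print("empirical correlation:", np.corrcoef(samples.T)[0, 1])
```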