PROBABILISTIC MODELS FOR TIME SERIES AND SEQUENCES

ICS 280D, SPRING 2000
Time and Location: Mon/Wed, 2 to 3:20, ICF 101
email: smyth@ics.uci.edu

### Project Information

Instructions for Class Projects

### Course Goals

In this course students will learn the general principles of how to build and analyze probabilistic models for time series and sequence data. The course will begin with basic concepts such as stationarity, autocorrelation, Markov properties, and likelihood. From there a variety of models will be explored, beginning with relatively simple autoregressive models (for real-valued time series) and Markov chains (for discrete data), and working up to more complex representations such as hidden Markov models and semi-Markov models. Basic principles of how to construct, analyze, estimate, and evaluate such models will be emphasized, and applications to areas such as speech recognition, financial forecasting, protein modeling, and fault detection will be discussed where appropriate.
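As a small taste of the kind of model the course builds toward, here is a minimal sketch (not part of the course materials; the function names are illustrative) of maximum-likelihood estimation for a first-order Markov chain on a discrete alphabet, with the resulting log-likelihood of the observed sequence:

```python
import math

def fit_markov_chain(seq, states):
    """Estimate transition probabilities of a first-order Markov chain by ML."""
    # Count transitions: counts[i][j] = number of times state i is followed by j.
    counts = {i: {j: 0 for j in states} for i in states}
    for a, b in zip(seq, seq[1:]):
        counts[a][b] += 1
    # Maximum-likelihood estimates are the row-normalized counts
    # (falling back to uniform for states never visited).
    trans = {}
    for i in states:
        total = sum(counts[i].values())
        trans[i] = {j: (counts[i][j] / total if total else 1.0 / len(states))
                    for j in states}
    return trans

def log_likelihood(seq, trans):
    """Log-likelihood of the sequence, conditioning on the first symbol."""
    return sum(math.log(trans[a][b]) for a, b in zip(seq, seq[1:]))

seq = list("AABABBAAAB")
trans = fit_markov_chain(seq, states="AB")
```

Fitting a Markov chain thus reduces to counting transitions, and the likelihood factorizes over consecutive pairs of symbols because of the Markov property; both points are developed in detail in the weeks on Markov chains and parameter estimation.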

### Text

For classic time-series models we will use *The Analysis of Time Series*, C. Chatfield, Chapman and Hall, 5th edition, 1996. This will be supplemented by a collection of research papers, mainly tutorial in nature, covering discrete-valued sequences and the more advanced topics.

### Prerequisites

Ideally the student will have some familiarity with basic concepts in learning from data (e.g., have taken ICS 273 or ICS 278 or some equivalent) as well as a good basic understanding of probability (e.g., have taken ICS 274 or the equivalent of an upper-division or introductory graduate course in probability and statistics). However, the course will be reasonably self-contained, so these are not strict prerequisites. If in doubt, please come and discuss it with me before signing up.

### Syllabus

| Week | Topic | Reading |
|------|-------|---------|
| April 3, 5 | Introduction: types of time series and sequences, basic concepts, descriptive techniques, visualization, stationarity, trends, seasonality, autocorrelation | Chapters 1, 2, and 3.1 to 3.3 in the text |
| April 10, 12 | Markov Chains: review of basic probability, independence and conditional independence, definition of likelihood, Markov chains, properties of Markov chains, classification of states, steady-state conditions | Handout (Chapter 4 from Ross) |
| April 17, 19 | Graphical Models: basic principles of directed graphical models, conditional independence properties, general inference algorithms | Two chapters from Jordan and Bishop |
| April 24, 26 | Probability Models for Time Series: linear autoregressive models, conditional independence properties, stability, forecasting and applications | Chapter 3 in text |
| May 1, 3 | Learning and Parameter Estimation: maximum likelihood, parameter estimation, examples for Markov chains and linear models, Bayesian methods | Chapters 4 and 5 in text, handout |
| May 8, 10 | Hidden Variable Markov Models: graphical models with hidden variables, hidden Markov models, Kalman filters, inference and estimation, outline of the EM algorithm; applications in speech recognition, protein modeling, and fault detection | Chapter 10 in text, two chapters from Jordan and Bishop |
| May 15, 17 | Clustering Sequences: probabilistic model-based approaches, more on EM, applications to clustering Web data | Cadez et al. paper on Web clustering |
| May 22, 24 | Semi-Markov and Segmental Markov Models: semi-Markov processes, segmental Markov processes, applications to pattern recognition and change detection | Application paper by Ge and Smyth |
| May 31 | Event Data Models: Poisson and related models, modeling bursts, different models for event characterization | Chapter from text by Ross |
| June 5, 7 | Advanced Topics | TBD |