Applied Bayesian Nonparametrics

Tutorial Course at CVPR 2012

Sunday, June 17 from 8:30am-12:30pm, Rhode Island Convention Center

Course Information
See the materials.
Instructor
Erik Sudderth, lastname-at-cs-dot-brown-dot-edu. Assistant Professor of Computer Science, Brown University.
Related Courses
Brown CSCI 2950-P: Applied Bayesian Nonparametrics, Fall 2011.

Overview

Bayesian nonparametric (BNP) models define distributions on infinite-dimensional spaces of functions, partitions, or other combinatorial structures. They lead to flexible, data-driven unsupervised learning algorithms, and to models whose internal structure continually grows and adapts to new observations. Applied to computer vision problems, BNP methods have led to segmentation algorithms which adapt their resolution to each image, learning algorithms which discover objects and activities from videos, and low-level vision systems which adapt their local appearance dictionaries to each image. More generally, BNP models provide a practical alternative to the model selection difficulties which arise with traditional unsupervised learning algorithms.

This tutorial surveys state-of-the-art approaches to Bayesian nonparametrics, from its foundations in stochastic processes to the practical tools needed for large-scale computation. Our focus is on those BNP models which have proven most useful in practice, including models which allow "infinite" cluster or feature-based data representations, and extensions which capture temporal or spatial dependencies. We discuss learning algorithms based on variational and Monte Carlo approximations, and ground our presentation in applied examples of modeling image and video data.
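A canonical example of the "infinite" cluster representations mentioned above is the Chinese restaurant process (CRP), the partition distribution underlying Dirichlet process mixture models. The sketch below (function names and the default seed are our own choices, not part of the course materials) samples a random partition in which each item joins an existing cluster with probability proportional to its size, or starts a new cluster with probability proportional to a concentration parameter alpha:

```python
import random

def crp_partition(n, alpha, seed=0):
    """Sample cluster assignments for n items from a Chinese
    restaurant process with concentration parameter alpha."""
    rng = random.Random(seed)
    sizes = []        # sizes[k] = number of items in cluster k
    assignments = []  # cluster label for each item
    for i in range(n):
        # Item i joins cluster k with probability sizes[k] / (i + alpha),
        # or starts a new cluster with probability alpha / (i + alpha).
        weights = sizes + [alpha]
        r = rng.uniform(0, i + alpha)
        k, acc = 0, weights[0]
        while r > acc:
            k += 1
            acc += weights[k]
        if k == len(sizes):
            sizes.append(1)   # open a new cluster
        else:
            sizes[k] += 1
        assignments.append(k)
    return assignments

print(crp_partition(10, alpha=1.0))
```

Because the number of occupied clusters grows (logarithmically, in expectation) with the number of items, the model's complexity adapts to the data rather than being fixed in advance, which is the property the tutorial's vision applications exploit.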

In this tutorial, we aim to make the big ideas underlying BNP methods accessible to the entire computer vision community. However, an introductory course in statistical machine learning will be helpful in understanding some concepts. Relevant foundational material includes parametric Bayesian methods for prediction and parameter estimation; clustering via probabilistic mixture models; the expectation maximization (EM) algorithm; and Markov chain Monte Carlo (MCMC) methods, particularly the Gibbs sampler.

Instructor

Erik Sudderth

Erik B. Sudderth is an Assistant Professor in the Brown University Department of Computer Science. He received the Bachelor's degree (summa cum laude) in Electrical Engineering from the University of California, San Diego, and the Master's and Ph.D. in EECS from the Massachusetts Institute of Technology. His research interests include probabilistic graphical models; nonparametric Bayesian methods; and applications of statistical machine learning in computer vision, signal processing, and artificial intelligence. He was awarded a National Defense Science and Engineering Graduate Fellowship (1999), an Intel Foundation Doctoral Fellowship (2004), and in 2008 was named one of "AI's 10 to Watch" by IEEE Intelligent Systems Magazine.

Prof. Sudderth is the author of over 30 refereed publications involving probabilistic graphical models or nonparametric Bayesian methods, with a particular focus on applications in computer vision. He has developed Bayesian nonparametric models for a number of vision problems, including image denoising, object and scene recognition, image segmentation, optical flow estimation, and time series analysis. In the Fall of 2011, he taught a graduate seminar in applied Bayesian nonparametrics. In September of 2012, he is co-organizing an ICERM Workshop on Bayesian Nonparametrics.