CompSci-276 Fall 2009, Belief Networks


New: info about projects
 

Course Reference

  • Days: Monday/Wednesday
  • Time: 11:00 a.m. - 12:20 p.m.
  • Room: DBH 1423
  • Instructor: Rina Dechter
  • Office hours: Wed. 2-3pm, DBH 4232

Course Description

One of the main challenges in building intelligent systems is the ability to reason under uncertainty, and one of the most successful approaches for dealing with this challenge is based on the framework of Bayesian networks, also called graphical models. Intelligent systems based on Bayesian networks are being used in a variety of real-world applications including diagnosis, sensor fusion, on-line help systems, credit assessment, bioinformatics and data mining.

The objective of this class is to provide an in-depth exposition of knowledge representation and reasoning under uncertainty using the framework of Bayesian networks. Both theoretical underpinnings and practical considerations will be covered, with a special emphasis on dependency and independency models, on constructing Bayesian graphical models, and on exact and approximate probabilistic reasoning algorithms. Additional topics include: causal networks, learning Bayesian network parameters from data, and dynamic Bayesian networks.

Prerequisites

  • Familiarity with basic concepts of probability theory.
  • Knowledge of basic computer science, algorithms and programming principles.
  • Previous exposure to AI is desirable but not essential.

Course material

The course will be based mostly on three sources:

Additional sources:

A longer list including secondary references.

Some links to software and tools.

Tentative Syllabus

Week     Date Topic Readings           Files
Week 1 9/28
  • Introduction to Bayesian networks.
(a) Pearl 1-2
(b) Darwiche 1-3
(c) Russell-Norvig 13
Homework 1
Class Slides 1
  9/30
  • Probabilistic networks representation: Independence properties.
(a) Pearl 3
(b) Darwiche 4

Week 2 10/5
  • Markov networks: Undirected graphical models of independence.
(a) Pearl 3
(b) Darwiche 4
Homework 2
Slides 2 (final)
  10/7
  • Bayesian networks: Directed graphical models of independence.


Week 3 10/12
  • Bayesian networks: Directed graphical models (continued)

Slides 3 (updated)
  10/14
  • Building Bayesian networks
(a) Darwiche 5

Week 4 10/19
  • Exact inference: Variable elimination
(a) Notes 1-5
(b) Darwiche 6
Homework 3

10/21
  • Optimization queries: MPE and MAP

Class Slides 4
Week 5 10/26
  • Tree-decompositions: bucket trees, join-trees and polytrees. Cluster tree elimination and propagation algorithms.
(a) Notes
Class Slides 5 (updated 11/3)

10/28
  • Search and Inference: The loop-cutset and w-cutset schemes.
(a) Notes 5-6
(b) Darwiche 7
(c) Pearl 4
Homework 4 (updated 11/4)

Week 6 11/2
  • Bounded inference: mini-bucket, mini-clustering, belief propagation schemes.
(a) Notes 7
(b) Darwiche 14
(c) Pearl 4
Class Slides 6

11/4
  • Bounded Inference continued



Week 7 11/9
  • Approximate reasoning by sampling: MCMC methods (Gibbs sampling), importance sampling.
  • Cutset conditioning sampling.
(a) Notes
(b) Pearl 4
(c) Darwiche 15
Homework 5
Class Slides 7

11/11
  • No class. Veterans Day holiday.


Week 8 11/16
  • Sampling continued
  • Representation: Local structures. Causal independence, context-specific independence, and determinism.
(a) Notes (Cutset paper)
(b) Darwiche 15
Homework 6 (updated 11/18)

11/18
  • Advanced Search and Inference: AND/OR Search
(a) Notes (AND/OR paper)
Class Slides 8
Week 9 11/23
  • Extended representations: Mixed probabilistic and deterministic networks. Hybrid discrete and continuous networks. Dynamic Bayesian networks. First-order probabilistic languages.
(a) Notes
Class Slides 9

11/25
  • Learning / Causality (TBD)



Week 10 11/30
  • Project presentations.



12/2
  • Project presentations.



Assignments:

There will be regular homework assignments, and students will also carry out class projects.

Grading Policy:

Homework and exam (75%), class project (25%)