Project Information

Course Reference

    Lecture
    Days: Tuesday/Thursday
    Time: 11:00 am - 12:20 pm
    Room: ICS 259

    Discussion
    Days: Thursday
    Time: 1:00 pm - 1:50 pm
    Room: DBH 1423

    Instructor: Rina Dechter
    Office hours: TBA

Course Description

One of the main challenges in building intelligent systems is the ability to reason under uncertainty, and one of the most successful approaches for dealing with this challenge is based on the framework of Bayesian networks, also called graphical models. Intelligent systems based on Bayesian networks are being used in a variety of real-world applications including diagnosis, sensor fusion, on-line help systems, credit assessment, bioinformatics and data mining.

The objective of this class is to provide an in-depth exposition of knowledge representation and reasoning under uncertainty using the framework of Bayesian networks. Both theoretical underpinnings and practical considerations will be covered, with a special emphasis on dependence and independence models, on constructing Bayesian graphical models, and on exact and approximate probabilistic reasoning algorithms. Additional topics include: causal networks, learning Bayesian network parameters from data, and dynamic Bayesian networks.
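To give a flavor of the kind of probabilistic reasoning the course covers, here is a minimal sketch of exact inference in a tiny Bayesian network. The network structure (the classic rain/sprinkler/wet-grass example) and all CPT numbers are illustrative assumptions, not course material; the query is answered by brute-force enumeration, the baseline that variable elimination and the other exact algorithms in the syllabus improve upon.

```python
# Illustrative sketch: a three-node Bayesian network with made-up CPTs,
# queried by enumerating the joint distribution.
from itertools import product

# CPTs: P(rain), P(sprinkler), P(wet | sprinkler, rain) -- assumed numbers.
p_rain = {True: 0.2, False: 0.8}
p_sprinkler = {True: 0.1, False: 0.9}
p_wet = {  # keyed by (sprinkler, rain)
    (True, True): 0.99, (True, False): 0.9,
    (False, True): 0.8, (False, False): 0.0,
}

def joint(r, s, w):
    """Joint probability P(rain=r, sprinkler=s, wet=w) from the CPTs."""
    pw = p_wet[(s, r)]
    return p_rain[r] * p_sprinkler[s] * (pw if w else 1.0 - pw)

# Posterior P(rain=True | wet=True): sum out sprinkler, then normalize.
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
print(round(num / den, 3))  # -> 0.695: wet grass makes rain more likely
```

Enumeration is exponential in the number of variables; the exact-inference weeks of the syllabus (variable elimination, junction trees, cutset conditioning) are precisely about exploiting the graph structure to avoid that blow-up.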

Prerequisites

  • Familiarity with basic concepts of probability theory.
  • Knowledge of basic computer science, algorithms and programming principles.
  • Previous exposure to AI is desirable but not essential.

Course material

The course will be based mostly on three sources:

Additional sources:

A longer list of secondary references.

Some links to software and tools.

Syllabus

Week 1
    4/2
        Readings: (a) Pearl 1-2; (b) Darwiche 1-3; (c) Russell-Norvig 13; (d) Darwiche, Bayesian Networks
        Files: Homework 1, Slides 1
    4/4: Markov networks: undirected graphical models (Lecture 2)
        Files: Slides 2

Week 2
    4/9: Bayesian networks: directed graphical models (Lecture 3)
        Files: Homework 2, Slides 3
    4/11: Bayesian networks: directed graphical models of independence
        Readings: Pearl Ch. 3

Week 3
    4/16: Building Bayesian networks
        Readings: Darwiche Ch. 5
        Files: Slides 4
    4/18: Building Bayesian networks (continued)

Week 4
    4/23: Exact inference by variable elimination (Lecture 7)
        Readings: Class Notes 1-6
        Files: Homework 3, Slides 5
    4/25

Week 5
    4/30: Exact inference by tree decompositions: join-tree/junction-tree algorithm; cluster tree elimination (Lecture 9)
        Files: Slides 6
    5/2: Exact inference by tree decomposition; cutset-conditioning scheme
        Files: Homework 4

Week 6
    5/7
        Readings: Class Notes 7-9
        Files: Slides 7
    5/9

Week 7
    5/14: Approximate algorithms by sampling: MCMC schemes (Lecture 12)
        Files: Homework 5, Slides 8
    5/16: Approximate algorithms by sampling: advanced schemes
        Files: Slides 9

Week 8
    5/21: Approximate algorithms by bounded inference: mini-bucket and generalized belief propagation (Lecture 13)
        Readings: Class Notes 10-11
        Files: Homework 6, Slides 10a
    5/23: Approximate algorithms by bounded inference (continued); variational methods: mean field, weighted mini-bucket (Lecture 14)
        Files: Andrew's Slides on Variational Methods

Week 9
    5/28: Approximate algorithms by bounded inference (continued) (Lecture 15)
        Files: Slides 10b
    5/30: Approximate algorithms by bounded inference (continued) (Lecture 16)

Week 10
    6/4: Project presentations
    6/6: Project presentations

Week 11
    6/11: Project presentations
Assignments:

There will be homework assignments, and students will also work on a class project.

Grading Policy:

Homework and exam (75%), class project (25%)