Dr. Rina Dechter - University of California, Irvine
Revised on May 21, 2017

CompSci-276 Spring 2017, Reasoning in Probabilistic Graphical Models: Belief Networks


Project Information

Course Reference

    Lecture
    Days: Monday/Wednesday
    Time: 11:00 am - 12:20 pm
    Room: DBH 1427

    Instructor: Rina Dechter
    Office hours: Monday 3:00 pm - 4:00 pm

Course Description

One of the main challenges in building intelligent systems is the ability to reason under uncertainty, and one of the most successful approaches for dealing with this challenge is the framework of Bayesian networks, a central class of probabilistic graphical models. Intelligent systems based on Bayesian networks are used in a variety of real-world applications including diagnosis, sensor fusion, on-line help systems, credit assessment, bioinformatics and data mining.

The objective of this class is to provide an in-depth exposition of knowledge representation and reasoning under uncertainty using the framework of Bayesian networks. Both theoretical underpinnings and practical considerations will be covered, with a special emphasis on dependency and independence models, on constructing Bayesian graphical models, and on exact and approximate probabilistic reasoning algorithms. Additional topics include causal networks, learning Bayesian network parameters from data, and dynamic Bayesian networks.
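As a flavor of the kind of reasoning the course covers, here is a minimal sketch (not course material, with made-up probability values) of a two-node Bayesian network Rain → WetGrass, where the joint distribution factorizes by the chain rule and a posterior is computed by enumeration:

```python
# Toy Bayesian network: Rain -> WetGrass.
# All probability numbers below are illustrative assumptions.

P_rain = {True: 0.2, False: 0.8}                      # P(Rain)
P_wet_given_rain = {True:  {True: 0.9, False: 0.1},   # P(WetGrass | Rain=true)
                    False: {True: 0.3, False: 0.7}}   # P(WetGrass | Rain=false)

def joint(rain, wet):
    """Chain-rule factorization: P(Rain, WetGrass) = P(Rain) * P(WetGrass | Rain)."""
    return P_rain[rain] * P_wet_given_rain[rain][wet]

def posterior_rain_given_wet(wet=True):
    """P(Rain | WetGrass = wet), by enumerating the joint and normalizing."""
    unnorm = {r: joint(r, wet) for r in (True, False)}
    z = sum(unnorm.values())
    return {r: p / z for r, p in unnorm.items()}

post = posterior_rain_given_wet(True)
# P(Rain=true | WetGrass=true) = 0.2*0.9 / (0.2*0.9 + 0.8*0.3) = 0.18/0.42 ≈ 0.4286
```

Exact inference by enumeration like this is exponential in the number of variables; much of the course concerns algorithms (variable elimination, tree decompositions, sampling) that do better by exploiting the network's structure.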

Prerequisites

  • Familiarity with basic concepts of probability theory.
  • Knowledge of basic computer science, algorithms and programming principles.
  • Previous exposure to AI is desirable but not essential.

Syllabus

Week 1
  4/3
    Topic: Bayesian networks
    Readings: (a) Pearl Ch. 1-2; (b) Darwiche Ch. 1-3; (c) Russell-Norvig Ch. 13; (d) Darwiche
    Files: Homework 1; Slides 1
  4/5
    Topic: Markov networks: undirected graphical models (Lecture 2)
    Files: Slides 2

Week 2
  4/10
    Topic: Bayesian networks: directed graphical models
    Files: Homework 2
  4/12
    Topic: Bayesian networks: directed graphical models of independence (Lecture 4)
    Readings: Pearl Ch. 3
    Files: Slides 3

Week 3
  4/17
    Readings: Darwiche Ch. 5
    Files: Slides 4
  4/19
    Topic: Building Bayesian networks
    Files: Homework 3; Mixtures of Trees

Week 4
  4/24
    Topic: Exact inference by variable elimination (Lecture 7)
    Readings: Dechter Ch. 4; Darwiche Ch. 6
    Files: Slides 5
  4/26
    Topic: Local structure in CPTs and induced-width algorithms
    Files: Slides 5b; Homework 4

Week 5
  5/1
    Topic: Exact inference by tree decompositions: join-tree/junction-tree algorithm; cluster tree elimination (Lecture 9)
    Readings: Dechter Ch. 5; Darwiche Ch. 7-8
    Files: Slides 6
  5/3
    Topic: Exact inference by tree decomposition; cutset-conditioning scheme (Lecture 10)

Week 6
  5/8
    Files: Homework 5
  5/10
    Files: Slides 7

Week 7
  5/15
    Topic: Approximate algorithms by bounded inference (Lecture 13)
  5/17
    Topic: Approximate algorithms by bounded inference (continued)
    Readings: Class Notes Ch. 8
    Files: Homework 6; Slides 8

Week 8
  5/22
    Topic: Approximate algorithms by sampling: MCMC schemes
  5/24
    Topic: Approximate algorithms by sampling: advanced schemes

Week 9
  5/29
    Topic: Approximate algorithms by bounded inference (continued)
  5/31
    Topic: Approximate algorithms by bounded inference (continued)

Week 10
  6/5
    Topic: Project presentations
  6/7
    Topic: Project presentations

Week 11
  6/12
    Topic: Project presentations

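The exact-inference topics above (variable elimination, Week 4) can be previewed with a tiny sketch. This is an illustrative toy with made-up numbers, not course code: on a chain A → B → C, P(C) is computed by summing out A and then B, multiplying only the factors that mention the eliminated variable:

```python
# Toy variable elimination on the chain A -> B -> C (all values illustrative).
# Factors are dicts from value tuples to probabilities.

P_A = {0: 0.6, 1: 0.4}                                           # P(A)
P_B_given_A = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8}  # P(B|A), key (a, b)
P_C_given_B = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.6}  # P(C|B), key (b, c)

# Eliminate A: f1(b) = sum_a P(a) * P(b | a)
f1 = {b: sum(P_A[a] * P_B_given_A[(a, b)] for a in (0, 1)) for b in (0, 1)}

# Eliminate B: P(c) = sum_b f1(b) * P(c | b)
P_C = {c: sum(f1[b] * P_C_given_B[(b, c)] for b in (0, 1)) for c in (0, 1)}
# P_C[0] = 0.65, P_C[1] = 0.35
```

The cost of each elimination step is governed by the size of the intermediate factor, which is what the induced-width material (Week 4) analyzes.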
Assignments:

There will be regular homework assignments, and students will also carry out a class project.

Grading Policy:

Homework and exam (75%), class project (25%)