## Machine Learning – Fall 2006

## ICS 273A

Instructor: Max Welling


## Prerequisites

ICS 270A (Intro AI), or with consent of the instructor.


## Goals:

The goal of this class is to familiarize you with various state-of-the-art machine learning techniques for classification, regression, clustering, and dimensionality reduction. Beyond this, an important aspect of this class is to provide a modern statistical view of machine learning. Through projects you will learn to do independent research on real-world datasets.


## Homework

Please see the green boxes on the slides.

## Projects

Submit a one-page description with detailed information about what you would like to do, or default to one of these.


## Syllabus:

1. Introduction: overview, examples, goals, algorithm evaluation, statistics, kNN, logistic regression. [slides lec1] [slides lec2]

2. Classification I: decision trees, random forests, bagging, boosting. [slides lec3,4]

3. Clustering & dimensionality reduction: k-means, expectation-maximization, PCA. [slides lec5,6]

4. Neural networks: perceptron, multi-layer networks, back-propagation. [slides lec7,8]

5. Reinforcement learning: MDPs, TD- and Q-learning, value iteration. [slides lec9,10]

6. Bayesian methods: conditional independence, generative models, naive Bayes classifier. [slides lec11,12]

7. Classification II: kernel methods & support vector machines. [slides lec13,14] Required reading on SVMs: [classnotes SVM]. Additional background reading: [classnotes convex optimization].

In the last week we will do class presentations of your projects. Please prepare a 10-minute talk.
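To give a flavor of the techniques listed above, here is a minimal sketch of the k-nearest-neighbors classifier from lecture 1. Note that course assignments must be done in MATLAB; this Python version is purely illustrative, and the function and variable names are made up for the example.

```python
# Minimal k-nearest-neighbors classifier (illustrative sketch only;
# coursework for this class must be implemented in MATLAB).
import math
from collections import Counter

def knn_classify(train_X, train_y, x, k=3):
    """Predict the label of point x by majority vote among its k
    nearest training points under Euclidean distance."""
    dists = [(math.dist(p, x), label) for p, label in zip(train_X, train_y)]
    dists.sort(key=lambda t: t[0])           # closest points first
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]        # majority label

# Toy 2-D data: two well-separated clusters with labels "a" and "b".
X = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9), (4.9, 5.2)]
y = ["a", "a", "a", "b", "b", "b"]
print(knn_classify(X, y, (0.15, 0.1)))       # query near the first cluster
```

The choice of k trades off noise sensitivity (small k) against over-smoothing (large k), a point the algorithm-evaluation material in lecture 1 addresses.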


## Course Format

The course will primarily be lecture-based, with homework and exams. Most homework will revolve around the implementation of various classification algorithms on the SciTech dataset provided above. You are required to use MATLAB for this coding work.


## Grading Criteria

Grading will be based on a combination of weekly homework (10%), projects (35%), a midterm (20%), and a final exam (35%).


## Textbook

The textbook that will be used for this course is:

1. Tom Mitchell: Machine Learning *(http://www.cs.cmu.edu/~tom/mlbook.html)*

Optional side readings are:

2. D. MacKay: Information Theory, Inference and Learning Algorithms

3. R.O. Duda, P.E. Hart, D. Stork: Pattern Classification

4. C.M. Bishop: Neural Networks for Pattern Recognition

5. T. Hastie, R. Tibshirani, J.H. Friedman: The Elements of Statistical Learning

6. B.D. Ripley: Pattern Recognition and Neural Networks