**PENN CIS 620, FALL 2008: COMPUTATIONAL LEARNING THEORY**

Prof. Michael Kearns
and
Dr. Koby Crammer

mkearns@cis.upenn.edu,
crammer@cis.upenn.edu

Tuesdays 12-3 PM (first meeting Sep 9)

315 Levine Hall (note change of room)

URL for this page:
www.cis.upenn.edu/~mkearns/teaching/COLT

**COURSE DESCRIPTION**

This course is an introduction to Computational Learning Theory, a field that aims to provide algorithmic, complexity-theoretic, and statistical foundations for modern machine learning.

The first part of the course will closely follow portions of An Introduction to Computational Learning Theory, by M. Kearns and U. Vazirani (MIT Press). We will cover perhaps 6 or 7 of the chapters in K&V over (approximately) the first half of the course, often supplementing with additional readings and materials. Copies of K&V will be available at the Penn bookstore. The second portion of the course will focus on a number of more recent but still fundamental topics in learning theory not covered in K&V.

The course will give a broad overview of the kinds of problems and techniques typically studied in Computational Learning Theory, and provide a basic arsenal of powerful mathematical tools for analyzing machine learning problems.

Topics to be covered include:

Notes on related courses:

**COURSE FORMAT, REQUIREMENTS, AND PREREQUISITES**

Much of the course will be in fairly traditional "chalk talk" lecture format, but with ample opportunity for discussion, participation, and critique. The course will meet once a week on Tuesdays from 12 to 3. Lunch will be served.

There are no official prerequisites, but participants should be comfortable with the basics of the design and analysis of algorithms, computational complexity, statistics and probability theory. The level of the material is such that the typical graduate student in computer science or statistics would have an appropriate background; strong undergraduates are also encouraged to join. Auditors and occasional participants are welcome.

The course requirements for registered students are active in-class participation, occasional problem sets, possibly leading a class discussion, and a final project.

READING: K&V Chapters 1 and 2.

Tue Sep 23

[MK] Occam's Razor, continued; VC dimension.

READING: K&V Chapters 2 and 3.

Tue Sep 30

[MK] VC dimension, continued; Structural Risk Minimization; generalizations.

Tue Oct 7

[KC] Better generalization bounds: Rademacher complexity, PAC-Bayes.

Tue Oct 14

Fall break, no class.

Tue Oct 21

[MK] Learning with noise and errors; SQ learning.

READING: K&V Chapter 5; supplementary reading:

HOMEWORK #2, DUE AS HARDCOPY IN CLASS NOV 7: Solve the following problems from K&V: 3.1, 3.5, 5.1, 5.3. You are free to collaborate on the homework assignments, but please turn in separate write-ups and acknowledge your collaborations. Please do not attempt to look up solutions in the literature.

Tue Oct 28

[KC] Boosting.

READING: K&V Chapter 4; supplementary readings.

CHANGE OF DATE: Tue Nov 4 ===> Fri Nov 7

[MK] In honor of Election Day, this week's class has been MOVED
from Tuesday to Friday, Nov 7, 12-2 PM in Room 337 Moore (we only have
the room for two hours). We will cover a complementary pair of results,
first showing that finite state automata cannot be learned in the PAC
model (under standard cryptographic assumptions), then showing that they
can be learned if the model is augmented with membership queries.
HW2 will be due in class Nov 7.

READING: K&V Chapters 6,7,8.

Tue Nov 11

[KC] Online and no-regret learning.

READING: Supplementary readings.

INFORMATION ON COURSE PROJECTS.

For those of you registered for credit in the course, your final assignment will be a course project, for which you have three choices:

The deadline for final projects will be December 20, via email to Prof. Kearns.

Tue Nov 18

[KC] Online and no-regret learning, continued.

READING: Supplementary readings.

Tue Nov 25

[MK] COLT methods in reinforcement learning.

READING: Supplementary readings.

Tue Dec 2

[MK] Topic TBD.

READING: Supplementary readings.