PENN CIS 625, SPRING 2017: COMPUTATIONAL LEARNING THEORY (aka Theory of Machine Learning)

Prof. Michael Kearns
mkearns@cis.upenn.edu

Time: Mondays 12-3 PM
Location: 307 Levine

IMPORTANT NOTE: The first course meeting will be on Monday Jan 23 from 12-3.

URL for this page:
www.cis.upenn.edu/~mkearns/teaching/COLT

Previous incarnations of this course:
www.cis.upenn.edu/~mkearns/teaching/COLT/colt16.html
www.cis.upenn.edu/~mkearns/teaching/COLT/colt15.html (with Grigory Yaroslavtsev)
www.cis.upenn.edu/~mkearns/teaching/COLT/colt12.html (with Jake Abernethy)
www.cis.upenn.edu/~mkearns/teaching/COLT/colt08.html (with Koby Crammer)


COURSE DESCRIPTION

This course is an introduction to Computational Learning Theory, a field that seeks to provide algorithmic, complexity-theoretic, and probabilistic foundations for modern machine learning and related topics.

The first part of the course will closely follow portions of An Introduction to Computational Learning Theory, by M. Kearns and U. Vazirani (MIT Press). We will cover perhaps 6 or 7 of the chapters in K&V over (approximately) the first half of the course, often supplementing with additional readings and materials. Copies of K&V will be available at the Penn bookstore. The second portion of the course will focus on a number of models and topics in learning theory and related areas not covered in K&V.

The course will give a broad overview of the kinds of problems and techniques typically studied in Computational Learning Theory, and provide a basic arsenal of powerful mathematical tools for analyzing machine learning problems.

Topics likely to be covered include:

  • Basics of the Probably Approximately Correct (PAC) Learning Model
  • Occam's Razor, Compression and Learning
  • Uniform Convergence and the Vapnik-Chervonenkis Dimension
  • Boosting
  • Learning in the Presence of Noise and Statistical Query Learning
  • Learning and Cryptography
  • Query Models
  • Online and No-Regret Learning Models
  • Agnostic Learning

Additional topics we may cover include:

  • Learning and Differential Privacy
  • Fairness in Machine Learning
  • Theory of Reinforcement Learning


COURSE FORMAT, REQUIREMENTS, AND PREREQUISITES

Much of the course will be in fairly traditional "chalk talk" lecture format, but with ample opportunity for discussion, participation, and critique. The course will meet once a week on Mondays from 12 to 3, with the first meeting on Jan 23. Lunch will be served.

The course will involve advanced mathematical material, will cover formal proofs in detail, and will be taught at the doctoral level. While there are no specific formal prerequisites, background or coursework in algorithms, complexity theory, discrete math, combinatorics, probability theory, and statistics will prove helpful, as will "mathematical maturity" in general. If you have questions about the desired background, please ask. Auditors and occasional participants are welcome.

The course requirements for registered students will be a mixture of active in-class participation, problem sets, possibly leading a class discussion, and a final project. The final projects can range from actual research work, to a literature survey, to solving some additional problems.


DETAILED MEETING/TOPIC SCHEDULE
The exact timing and set of topics below will depend on our progress and will be updated as we proceed.

Mon Jan 23
In the first meeting, we will go over course mechanics and present a course overview, then immediately begin investigating the Probably Approximately Correct (PAC) model of learning. Detailed topics covered: learning rectangles in the real plane; definition of the PAC model; PAC-learnability of rectangles in d dimensions.

READING: K&V Chapter 1.
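
For concreteness, here is a minimal Python sketch of the tightest-fitting rectangle learner analyzed in K&V Chapter 1 (the function names and data format are my own):

    # Sketch: the tightest-fitting axis-aligned rectangle learner.
    # Output the smallest rectangle containing all positive examples;
    # by the Chapter 1 argument, m >= (4/epsilon) * ln(4/delta) random
    # examples suffice for error <= epsilon with probability >= 1 - delta.

    def learn_rectangle(sample):
        """sample: list of ((x, y), label) pairs with label in {0, 1}."""
        positives = [point for point, label in sample if label == 1]
        if not positives:
            return None  # hypothesis: predict negative everywhere
        xs = [x for x, _ in positives]
        ys = [y for _, y in positives]
        return (min(xs), max(xs), min(ys), max(ys))

    def predict(rectangle, point):
        if rectangle is None:
            return 0
        x, y = point
        x_lo, x_hi, y_lo, y_hi = rectangle
        return int(x_lo <= x <= x_hi and y_lo <= y <= y_hi)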

Mon Jan 30
PAC-learnability of conjunctions of boolean variables. Intractability of PAC-learning 3-term DNF.

READING: K&V Chapter 1.
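
As a preview of the positive result, a minimal sketch (my own interface) of the standard elimination algorithm for conjunctions: begin with the conjunction of all 2n literals and delete every literal contradicted by a positive example; negative examples are never used.

    # Sketch: elimination algorithm for PAC-learning boolean conjunctions.
    # Literal (i, 1) means "x_i"; literal (i, 0) means "not x_i".

    def learn_conjunction(n, sample):
        """sample: list of (bits, label); bits is a tuple of n values in {0, 1}."""
        literals = {(i, b) for i in range(n) for b in (0, 1)}
        for bits, label in sample:
            if label == 1:
                # a positive example with bits[i] = b falsifies the
                # literal (i, 1 - b), so that literal must be deleted
                literals -= {(i, 1 - bits[i]) for i in range(n)}
        return literals

    def evaluate(literals, bits):
        return int(all(bits[i] == b for i, b in literals))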

Mon Feb 6
Learning 3-term DNF by 3CNF and the importance of hypothesis representation. Occam's Razor and consistent hypotheses.

READING: K&V Chapters 1 & 2.
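
The cardinality version of the Occam bound makes the role of consistency concrete: for a finite hypothesis class H, any hypothesis consistent with m >= (1/epsilon)(ln|H| + ln(1/delta)) random examples has true error at most epsilon with probability at least 1 - delta. A tiny helper (hypothetical, just to make the arithmetic tangible):

    from math import ceil, log

    def occam_sample_size(ln_h_size, epsilon, delta):
        """Samples sufficient for the finite-class Occam bound."""
        return ceil((ln_h_size + log(1.0 / delta)) / epsilon)

    # Conjunctions over n = 100 variables: |H| <= 3^n, so ln|H| <= n ln 3.
    print(occam_sample_size(100 * log(3), epsilon=0.1, delta=0.05))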

Mon Feb 13
Sample complexity and the Vapnik-Chervonenkis dimension.

READING: K&V Chapter 3.
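
Shattering can be checked by brute force in small cases, which is a useful way to build intuition (my own toy script, exponential in the set size): a class H shatters a set S if every one of the 2^|S| labelings of S is realized by some hypothesis.

    def shatters(hypotheses, points):
        """hypotheses: functions point -> {0, 1}. True iff every
        labeling of `points` is realized by some hypothesis."""
        realized = {tuple(h(p) for p in points) for h in hypotheses}
        return len(realized) == 2 ** len(points)

    # Thresholds on the line realize only "suffix" labelings, so they
    # shatter any one point but no two points: VC dimension 1.
    thresholds = [lambda x, t=t: int(x >= t) for t in (0.5, 1.5, 2.5, 3.5)]
    print(shatters(thresholds, [1.0]))       # True
    print(shatters(thresholds, [1.0, 2.0]))  # False: (1, 0) is unrealizable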

Mon Feb 20
Extensions and generalizations of the VC dimension: matching lower bound; unrealizable/agnostic setting; more refined complexity measures; complexity regularization and structural risk minimization; other loss functions.

    READING: "Decision Theoretic Generalizations of the PAC Model for Neural Net and Other Learning Applications", D. Haussler 1992.

Mon Feb 27
Learning with classification noise and the statistical query model.

READING: K&V Chapter 5.
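
In the statistical query model the learner never sees individual examples; it asks for expectations E[chi(x, c(x))] and receives answers accurate to within an additive tolerance tau. A minimal sketch of how ordinary random examples simulate the oracle STAT(chi, tau) (my own interface; the sample size is just Hoeffding's inequality):

    from math import ceil, log

    def stat_oracle(chi, tau, delta, draw_example):
        """Return an additive tau-approximation of E[chi(x, label)]
        with probability >= 1 - delta; chi must map into [0, 1]."""
        m = ceil(log(2.0 / delta) / (2.0 * tau ** 2))  # Hoeffding bound
        return sum(chi(*draw_example()) for _ in range(m)) / m

The point of the model, developed in Chapter 5, is that answers of this coarse statistical kind can still be recovered from examples corrupted by classification noise, so any SQ algorithm is automatically noise-tolerant.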

Course Problem Set:
We'll have just a single problem set for the term, in which you are asked to solve any 12 problems from K&V, subject to the conditions that they are chosen from at least 4 different chapters, and that you make an effort to tackle a range of difficulty in your chosen problems. You are free to collaborate on the problem set, but please turn in separate write-ups and acknowledge your collaborations. Please do not attempt to look up solutions in the literature. The problem set is due in hard copy in class on Mon Apr 3.

Mon Mar 6
No class, Penn Spring Break week.

Mon Mar 13
SQ learning continued: exponential lower bound for parity functions; SQ dimension and "nearly orthogonal" functions; superpolynomial lower bound for DNF and decision trees. Learning and cryptography: one-time pad; representation-independent hardness of learning finite automata, neural networks, and Boolean formulae via public-key cryptography.

READING: K&V Chapters 5 and 6.
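
The parity lower bound rests on the fact that distinct parity functions are pairwise uncorrelated under the uniform distribution, so any single statistical query can significantly distinguish only a small fraction of them. A quick numerical check of the orthogonality (my own toy script):

    from itertools import product

    def parity(subset, bits):
        """chi_S(x) in {-1, +1}: parity of the bits indexed by subset."""
        return 1 - 2 * (sum(bits[i] for i in subset) % 2)

    def correlation(s, t, n):
        """E[chi_S(x) chi_T(x)] under the uniform distribution on {0,1}^n."""
        return sum(parity(s, x) * parity(t, x)
                   for x in product((0, 1), repeat=n)) / 2 ** n

    print(correlation({0, 2}, {0, 2}, 4))  # 1.0: identical parities
    print(correlation({0, 2}, {1}, 4))     # 0.0: distinct parities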

Mon Mar 20
Weak and strong PAC learning. Schapire's original boosting construction. AdaBoost.

READING: K&V Chapter 4.
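
A compact sketch of AdaBoost with a hypothetical weak-learner interface (the reweighting, not the weak learner, is the point; assumes each round's weighted error lies strictly between 0 and 1/2):

    import math

    def adaboost(examples, labels, weak_learn, rounds):
        """examples: list of points; labels: list of +1/-1 values.
        weak_learn(weights) returns a hypothesis h: point -> +1/-1
        with weighted error strictly less than 1/2."""
        m = len(examples)
        weights = [1.0 / m] * m
        ensemble = []
        for _ in range(rounds):
            h = weak_learn(weights)
            err = sum(w for w, x, y in zip(weights, examples, labels)
                      if h(x) != y)
            alpha = 0.5 * math.log((1.0 - err) / err)
            # Reweight: mistakes scaled up by e^alpha, correct
            # predictions scaled down by e^-alpha, then renormalize.
            weights = [w * math.exp(-alpha * y * h(x))
                       for w, x, y in zip(weights, examples, labels)]
            total = sum(weights)
            weights = [w / total for w in weights]
            ensemble.append((alpha, h))
        return ensemble

    def predict(ensemble, x):
        return 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1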

Mon Mar 27
Learning with membership and equivalence queries; an algorithm for finite automata; the PAC solar system. Discussion of course final projects.

READING: K&V Chapter 8.
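
To fix ideas, a hedged sketch of the two oracles the finite-automata algorithm assumes, instantiated for a known target DFA (the brute-force search below is only a stand-in for the counterexample oracle):

    from itertools import product

    class DFA:
        def __init__(self, transitions, start, accepting):
            self.transitions = transitions
            self.start = start
            self.accepting = accepting

        def member(self, string):
            """Membership query: does this automaton accept `string`?"""
            state = self.start
            for symbol in string:
                state = self.transitions[state][symbol]
            return state in self.accepting

    def equivalence_query(target, hypothesis, alphabet, max_len=8):
        """Return a counterexample string on which target and hypothesis
        disagree, or None; brute force over all short strings."""
        for length in range(max_len + 1):
            for s in product(alphabet, repeat=length):
                if target.member(s) != hypothesis.member(s):
                    return s
        return None

    # Target: binary strings containing an even number of 1s.
    even_ones = DFA({0: {'0': 0, '1': 1}, 1: {'0': 1, '1': 0}}, 0, {0})
    print(even_ones.member(('1', '0', '1')))  # True: two 1s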

Mon Apr 3
Guest lecture by Jamie Morgenstern on PAC learning and mechanism design. Problem set due in hard copy in class.

Mon Apr 10

Mon Apr 17

Mon Apr 24