PENN CIS 625, FALL 2021: THEORY OF MACHINE LEARNING

Prof. Michael Kearns
mkearns@cis.upenn.edu

Time: Tuesdays and Thursdays 10:15-11:45
Location: MacNeil 286-7
Attendance at lectures is a course requirement unless prior arrangements have been made.
Office Hours: Right after class on Thursdays or by appointment. Weather permitting, I'll hold OHs in the outdoor seating area near MacNeil, otherwise in one of my offices. Please let me know in advance if you intend to come to OHs.

Teaching Assistants:
Daniel Lee: laniel@seas.upenn.edu
Office Hours: Wednesdays 7PM, location TBA
Claudia Zhu: jiyunzhu@seas.upenn.edu
Office Hours: Tuesdays 3:30PM, location TBA

URL for this page:
www.cis.upenn.edu/~mkearns/teaching/CIS625

Previous incarnations of this course:
www.cis.upenn.edu/~mkearns/teaching/CIS625/cis625-20.html
www.cis.upenn.edu/~mkearns/teaching/COLT/colt18.html
www.cis.upenn.edu/~mkearns/teaching/COLT/colt17.html
www.cis.upenn.edu/~mkearns/teaching/COLT/colt16.html
www.cis.upenn.edu/~mkearns/teaching/COLT/colt15.html (with Grigory Yaroslavtsev)
www.cis.upenn.edu/~mkearns/teaching/COLT/colt12.html (with Jake Abernethy)
www.cis.upenn.edu/~mkearns/teaching/COLT/colt08.html (with Koby Crammer)


COURSE DESCRIPTION

This course is an introduction to the theory of machine learning, providing mathematical, algorithmic, complexity-theoretic, and probabilistic/statistical foundations for modern machine learning and related topics.

The first part of the course will follow portions of An Introduction to Computational Learning Theory, by M. Kearns and U. Vazirani (MIT Press). We will cover perhaps 6 or 7 of the chapters in K&V over (approximately) the first half of the course, often supplementing with additional readings and materials. I will provide electronic versions of the relevant chapters of K&V.

The second portion of the course will focus on a number of models and topics in learning theory and related areas not covered in K&V.

The course will give a broad overview of the kinds of theoretical problems and techniques typically studied and used in machine learning, and provide a basic arsenal of powerful mathematical tools for analyzing machine learning problems.

Topics likely to be covered include:

  • Basics of the Probably Approximately Correct (PAC) Learning Model
  • Occam's Razor, Compression and Learning
  • Concentration, Uniform Convergence and the Vapnik-Chervonenkis Dimension
  • Boosting and Related Techniques
  • Learning in the Presence of Noise and Statistical Query Learning
  • Learning and Cryptography
  • Query Models
  • Online and No-Regret Learning Models
  • Agnostic Learning
  • Game Theory and Machine Learning
  • Learning and Differential Privacy
  • Fairness in Machine Learning
  • Theory of Reinforcement Learning


    COURSE FORMAT, REQUIREMENTS, AND PREREQUISITES

    The lectures will be in a mixture of "chalk talk" format and markup of lecture notes, but with ample opportunity for discussion, participation, and critique. The course will meet Tuesdays and Thursdays from 10:15 to 11:45, with the first meeting on Tuesday August 31.

    The course will involve advanced mathematical material, will cover formal proofs in detail, and will be taught at the doctoral level. While there are no specific formal prerequisites, background or courses in algorithms, complexity theory, discrete math, combinatorics, convex optimization, probability theory, and statistics will prove helpful, as will "mathematical maturity" in general. We will be examining detailed proofs throughout the course. If you have questions about the desired background, please ask. Auditors and occasional participants are welcome.

    The course requirements for registered students will be a mixture of active in-class participation, problem sets, possibly leading a class discussion, and a final project. The final projects can range from actual research work, to a literature survey, to solving some additional problems.


    MEETING/TOPIC SCHEDULE
    For those who have made prior arrangements for remote learning, and for review purposes, here is a link to last year's video lectures. We will undoubtedly deviate from both the schedule and content of these videos at some point, but they should prove helpful for those not in live attendance.

    The exact timing and set of topics below will depend on our progress and will be updated as we proceed.

    Tue Aug 31
    Course overview, topics, and mechanics. Introduction to the rectangle learning problem.

  • Intro Lecture Notes
  • Rectangle Lecture Notes
  • K&V Chapter 1.

    Tue Sep 7
    Analysis of the rectangle learning problem (a short sketch of the resulting bound appears after the readings below).

  • Rectangle Lecture Notes
  • K&V Chapter 1.
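
    As a quick reference for this lecture's argument, here is a compressed version of the strip-based bound for the tightest-fit rectangle, in notation of my own choosing (K&V Chapter 1 gives the careful statement). The hypothesis R' has error above epsilon only if one of four strips of weight epsilon/4 around the target rectangle receives no sample point, so by a union bound:

      % Sketch of the sample-size bound for axis-aligned rectangles (assumed notation).
      \Pr[\mathrm{err}(R') > \epsilon] \;\le\; 4\,(1 - \epsilon/4)^m \;\le\; 4\,e^{-\epsilon m/4} \;\le\; \delta
      \quad \text{whenever} \quad m \;\ge\; \frac{4}{\epsilon}\,\ln\frac{4}{\delta}.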

    Thu Sep 9
    Introduction to Probably Approximately Correct (PAC) model.

  • PAC Lecture Notes (We ended on about slide 7.)
  • K&V Chapter 1.

    Tue Sep 14
    Learning Boolean Conjunctions in the PAC Model (a short sketch of the elimination algorithm follows the readings below).

  • PAC Lecture Notes (We ended on about slide 14.)
  • K&V Chapter 1.
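
    For concreteness, here is a minimal Python sketch of the elimination algorithm for conjunctions discussed in this lecture; the function and variable names are my own, and the PAC analysis of its sample complexity is in K&V Chapter 1.

      # Minimal sketch of the standard elimination algorithm for boolean
      # conjunctions; only positive examples cause deletions of literals.
      def learn_conjunction(examples):
          """examples: list of (x, label), x a tuple of 0/1 values, label in {0, 1}.
          Returns surviving literals as (index, sign) pairs: sign=1 means x_i,
          sign=0 means the negation of x_i."""
          n = len(examples[0][0])
          hypothesis = {(i, s) for i in range(n) for s in (0, 1)}  # all 2n literals
          for x, label in examples:
              if label == 1:
                  # Delete every literal that this positive example falsifies.
                  hypothesis -= {(i, 1 - x[i]) for i in range(n)}
          return hypothesis

      def predict(hypothesis, x):
          # The conjunction holds iff every surviving literal is true on x.
          return int(all(x[i] == s for (i, s) in hypothesis))

      # Tiny usage example: the target concept is x_0 AND (not x_2).
      data = [((1, 0, 0), 1), ((1, 1, 0), 1), ((0, 1, 0), 0), ((1, 1, 1), 0)]
      h = learn_conjunction(data)
      print(predict(h, (1, 0, 0)))  # prints 1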

    Thu Sep 16
    Intractability of Learning 3-term DNF in the PAC Model.

  • PAC Lecture Notes (We ended on about slide 24.)
  • K&V Chapter 1.

    Tue Sep 21
    Learning 3-term DNF by 3CNF.

  • PAC Lecture Notes (We got to the end.)
  • K&V Chapter 1.

    Thu Sep 23
    Finite H: Learning and Consistency; Learning Decision Lists; Uniform Convergence

  • Consistency and Compression Lecture Notes
  • K&V Chapter 2.

    Tue Sep 28
    Finite H: Occam's Razor and Adaptive Data Analysis (the basic finite-H bound is sketched after the readings below)

  • Consistency and Compression Lecture Notes
  • K&V Chapter 2.
  • Here is Problem Set 1, due Wed Sep 29 via Gradescope.
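
    As a pointer back to the main calculation for finite H (my phrasing; the precise statement is in K&V Chapter 2): a fixed hypothesis with error above epsilon survives m independent examples as a consistent hypothesis with probability at most (1 - epsilon)^m, and a union bound over H gives

      \Pr[\exists\, h \in H:\ \mathrm{err}(h) > \epsilon \text{ and } h \text{ is consistent}]
        \;\le\; |H|\,(1-\epsilon)^m \;\le\; |H|\,e^{-\epsilon m} \;\le\; \delta
      \quad \text{for} \quad m \;\ge\; \frac{1}{\epsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right).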

    Thu Sep 30
    Uniform Convergence for Infinite H: Introduction to the VC Dimension

  • VC Dimension Lecture Notes (We ended on about slide 17.)
  • K&V Chapter 3.

    Tue Oct 5
    VC Dimension Continued: Growth Function and Sauer's Lemma (the lemma is stated after the readings below)

  • VC Dimension Lecture Notes (We ended on slide 29.)
  • K&V Chapter 3.
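
    For reference, the form of Sauer's Lemma we use, in notation I adopt here (see K&V Chapter 3): if H has VC dimension d, its growth function satisfies

      \Pi_H(m) \;\le\; \sum_{i=0}^{d} \binom{m}{i} \;\le\; \left(\frac{em}{d}\right)^{d}
      \quad \text{for } m \ge d \ge 1,

    so the number of labelings H induces on m points grows only polynomially in m once m exceeds d.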

    Thu Oct 7
    VC Dimension Continued: Two-Sample Trick and Main Theorem

  • VC Dimension Lecture Notes (We ended on slide 40.)
  • K&V Chapter 3.

    Tue Oct 12
    VC Dimension Continued: Lower Bounds, Structural Risk Minimization, Distribution-Specific Refinements

  • VC Dimension Lecture Notes (We ended on about slide 53.)
  • K&V Chapter 3.

    Tue Oct 19
    Generalizations of VC Dimension to Other Loss Functions. Introduction to Learning with Classification Noise

  • VC Dimension Lecture Notes
  • K&V Chapter 3.
  • Classification Noise Lecture Notes
  • K&V Chapter 5.
  • Here is Problem Set 2, due Sun Nov 7 via Gradescope.

    Thu Oct 21
    Learning with Classification Noise; the Statistical Query Model

  • Statistical Query Lecture Notes
  • K&V Chapter 5.

    Tue Oct 26
    Learning with Classification Noise; the Statistical Query Model

  • Statistical Query Lecture Notes
  • K&V Chapter 5.

    Tue Nov 2
    Learning with Classification Noise; the Statistical Query Model

  • Statistical Query Lecture Notes
  • K&V Chapter 5.

    Thu Nov 4
    Statistical Query Dimension; Impossibility Results for SQ; the PAC "Solar System"

  • Statistical Query Lecture Notes
  • K&V Chapter 5.
  • Here are Problem Set 1 Solutions. The mean was 73.7 and the standard deviation was 10.8.

    Tue Nov 9
    Weak and Strong PAC Learning: Boosting

  • Boosting Lecture Notes

    Thu Nov 11
    Weak and Strong PAC Learning: AdaBoost (a short sketch of the update appears after the readings below)

  • Boosting Lecture Notes
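
    As a companion to the AdaBoost discussion, here is a minimal Python sketch over a finite pool of weak hypotheses; the interface and names are my own and are meant only to mirror the reweighting rule from lecture, not any particular implementation in the notes.

      # Minimal AdaBoost sketch: labels are in {-1, +1}, weak hypotheses are
      # vectorized functions mapping an (m, n) array X to {-1, +1} predictions.
      import numpy as np

      def adaboost(X, y, weak_hypotheses, rounds):
          m = len(y)
          D = np.full(m, 1.0 / m)              # distribution over training examples
          chosen, alphas = [], []
          for _ in range(rounds):
              # Choose the weak hypothesis with smallest weighted error under D.
              errors = [np.sum(D * (h(X) != y)) for h in weak_hypotheses]
              t = int(np.argmin(errors))
              h, eps = weak_hypotheses[t], errors[t]
              if eps >= 0.5:                   # no advantage over random guessing
                  break
              alpha = 0.5 * np.log((1.0 - eps) / max(eps, 1e-12))
              # Misclassified points gain weight, correctly classified points lose it.
              D = D * np.exp(-alpha * y * h(X))
              D = D / D.sum()
              chosen.append(h)
              alphas.append(alpha)
          return chosen, alphas

      def boosted_predict(chosen, alphas, X):
          # Final rule: weighted majority vote of the chosen weak hypotheses.
          return np.sign(sum(a * h(X) for a, h in zip(alphas, chosen)))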

    Tue Nov 16
    No-Regret Learning: Multiplicative Weights (a short sketch of the update appears after the readings below)

  • No-Regret Lecture Notes
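
    Here is a minimal Python sketch of the multiplicative weights (Hedge) update over N actions with losses in [0, 1]; the names and learning-rate choice are mine, following the standard setup rather than any specific version in the notes.

      # Minimal multiplicative weights / Hedge sketch: maintain a weight per
      # action and exponentially down-weight actions in proportion to their loss.
      import numpy as np

      def hedge(loss_rounds, eta):
          """loss_rounds: iterable of length-N loss vectors with entries in [0, 1].
          Yields the probability distribution played at each round."""
          weights = None
          for losses in loss_rounds:
              losses = np.asarray(losses, dtype=float)
              if weights is None:
                  weights = np.ones_like(losses)    # start uniform over actions
              probs = weights / weights.sum()
              yield probs
              weights = weights * np.exp(-eta * losses)

      # Tiny usage example: two actions, three rounds of losses.
      for p in hedge([[0.0, 1.0], [0.0, 1.0], [1.0, 0.0]], eta=0.5):
          print(p)

    With eta on the order of sqrt(ln N / T) over T rounds, this update achieves regret O(sqrt(T ln N)) against the best single action in hindsight, the kind of guarantee used when connecting no-regret learning to game theory.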

    Thu Nov 18
    No-Regret Learning and Game Theory

  • No-Regret Lecture Notes

    Tue Nov 30
    Fairness in Machine Learning

  • Fair ML Lecture Notes

    Thu Dec 2
    Fairness in Machine Learning

  • Fair ML Lecture Notes