Spring 2017
Quiz 5
Note: answers are bolded

What is the size of the largest set of points that can be shattered by a linear threshold function mapping from ℝ to {0, 1}?
 One point
 **Two points**
 Three points
 Four points
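This can be verified by brute force. The sketch below (not part of the original quiz; the helper names `thresholds` and `shattered` are mine) enumerates every labeling of a point set on ℝ realizable by a 1-D threshold function h(x) = 1 iff w·x + b ≥ 0, and checks whether all 2^n labelings are achieved:

```python
def thresholds(points):
    """All labelings of `points` achievable by 1-D linear threshold
    functions h(x) = 1 if w*x + b >= 0 else 0, for real w, b."""
    labelings = set()
    xs = sorted(points)
    # Candidate cut positions: left of all points, between consecutive
    # points, and right of all points.
    cuts = [xs[0] - 1] + [(a + b) / 2 for a, b in zip(xs, xs[1:])] + [xs[-1] + 1]
    for c in cuts:
        for sign in (+1, -1):  # w > 0 or w < 0
            labelings.add(tuple(1 if sign * (x - c) >= 0 else 0
                                for x in points))
    return labelings

def shattered(points):
    """True iff every {0,1}-labeling of `points` is realized."""
    return len(thresholds(points)) == 2 ** len(points)

print(shattered([0.0, 1.0]))       # True: two points are shattered
print(shattered([0.0, 1.0, 2.0]))  # False: the labeling 1,0,1 is unreachable
```

Any labeling realized by a threshold on ℝ is a monotone step (all 0s then all 1s, or the reverse), so no set of three points can receive the labeling 1, 0, 1.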

What is the difference between the standard PAC learning setting and the agnostic PAC learning setting?
 In standard PAC learning, the sample complexity is polynomial in 1/ε, 1/δ, and n, while in agnostic PAC learning this is not true.
 In standard PAC learning, the generalization bounds are based on the size of the hypothesis class, while in agnostic PAC learning, generalization bounds are based on the VC dimension of the hypothesis class.
 In standard PAC learning, efficient learnability requires that the time required to learn is polynomial in 1/ε, 1/δ, and n, while in agnostic PAC learning this is not true.
 **In standard PAC learning, the hypothesis is required to be consistent with the training data, while in agnostic PAC learning this is not necessary.**

What can be said about the learnability of concept class C where the VC dimension of C is not finite?
 C may or may not be learnable.
 **C is not PAC learnable.**
 C is PAC learnable because any number of examples can be shattered.
 C is PAC learnable because there is a lower bound on the number of examples for which, with probability at least (1 − δ), c ∈ C has error less than ε.

While trying to determine the VC dimension of a concept class H, we find that we can shatter all samples of size 3. With some effort, we find that no sample of size 5 can be shattered by H. Given this information, which of the following statements is most accurate about the VC dimension of H (written VC(H))?
 VC(H) = 3
 VC(H) = 5
 **VC(H) can be 3 or 4**
 VC(H) > 5
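The same brute-force idea illustrates why both 3 and 4 remain possible. As an example (not from the quiz; the class choice and helper names are mine), take H to be unions of two closed intervals on ℝ: it shatters every 3-point set and some 4-point set, but no 5-point set, so its VC dimension is exactly 4:

```python
def two_interval_labelings(points):
    """All labelings achievable by h(x) = 1 iff x in [a1,b1] or [a2,b2].
    It suffices to place interval endpoints at the points themselves,
    plus the empty interval (None)."""
    xs = sorted(set(points))
    intervals = [(a, b) for a in xs for b in xs if a <= b] + [None]

    def inside(x, iv):
        return iv is not None and iv[0] <= x <= iv[1]

    labelings = set()
    for i1 in intervals:
        for i2 in intervals:
            labelings.add(tuple(1 if inside(x, i1) or inside(x, i2) else 0
                                for x in points))
    return labelings

def shatters(points):
    return len(two_interval_labelings(points)) == 2 ** len(points)

# Any labeling of at most 4 points has at most two runs of 1s, so two
# intervals suffice; the alternating labeling 1,0,1,0,1 needs three.
print(shatters([1, 2, 3]))        # True
print(shatters([1, 2, 3, 4]))     # True
print(shatters([1, 2, 3, 4, 5]))  # False
```

Knowing only that size-3 sets are shattered and size-5 sets are not leaves open whether some size-4 set is shattered, which is exactly why VC(H) can be 3 or 4.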

Given a data set S, you have the choice of two hypothesis spaces to use, H1 ⊆ H2.
Disregarding computational complexity issues, which one would you choose?
 Always H2, since it is more likely that I’ll find a hypothesis that is consistent with S.
 Always H1, due to Occam’s Razor.
 **H1, unless I know that it is not expressive enough.**
Dan Roth