Due 12:30pm Tuesday October 20.
I've put up some relevant lecture notes on
You are allowed to work in teams on these problems, but both team
members should be able to do all the problems by themselves, should
they ever need to.
- Figure out the VC dimension of the following concept classes, by
showing n points that can be shattered and giving a handwavy
explanation of why no set of n+1 points can be shattered. In
all these examples, the input space is two-dimensional.
- Unions of two half-planes (ie the OR of two perceptrons).
 - A second-degree polynomial threshold, ie accept points
(x1,x2) for which
p(x1,x2)>0, where p(x1,x2) is a quadratic polynomial in x1 and x2.
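A useful observation for this one: a quadratic threshold is just a perceptron applied to the feature map (1, x1, x2, x1^2, x1*x2, x2^2), so it has six free coefficients. A minimal sketch of the concept class (the function names here are my own, not from the notes):

```python
def quad_threshold(coeffs):
    """Build a quadratic-threshold concept from six coefficients:
    accept (x1,x2) iff p(x1,x2) = c0 + c1*x1 + c2*x2
                                 + c3*x1^2 + c4*x1*x2 + c5*x2^2 > 0."""
    c0, c1, c2, c3, c4, c5 = coeffs

    def accept(x1, x2):
        return c0 + c1*x1 + c2*x2 + c3*x1**2 + c4*x1*x2 + c5*x2**2 > 0

    return accept

# Example: the open unit disk is the quadratic threshold 1 - x1^2 - x2^2 > 0.
disk = quad_threshold((1, 0, 0, -1, 0, -1))
print(disk(0, 0), disk(2, 0))  # True False
```

Since this is a linear threshold over a 6-dimensional feature vector, the perceptron bound suggests the VC dimension is at most 6; the problem asks you to exhibit the matching shattered set.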
- Let C1 and C2 both be concept classes
whose elements have one-dimensional inputs, with
VC(C1)=n1 and VC(C2)=n2.
Let C be the "cross product" of C1 and C2, ie
for each concept c1 in C1 and c2 in
C2 there is a concept c in C, and c accepts an input
(x1,x2) when c1 accepts x1
and c2 accepts x2.
Example: if C1 and C2 are both the concept
class of one-dimensional intervals (ie accept an input x if
a<x<b for constants a and b), then C would be the concept class
of axis-aligned rectangles.
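For the rectangle example, a shattering claim can be checked by brute force: for any labeling, the tightest axis-aligned box around the positive points is the only candidate rectangle, so shattering fails exactly when that box swallows a negative point. A sketch of that check (my own helper, just to sanity-check answers):

```python
from itertools import product

def rect_shatters(points):
    """Return True iff axis-aligned rectangles shatter the given 2-D points.
    For each labeling, test the tightest box around the positives; if it
    contains no negative, a slightly larger rectangle realizes the labeling."""
    for labels in product([0, 1], repeat=len(points)):
        pos = [p for p, lab in zip(points, labels) if lab]
        if not pos:
            continue  # the all-negative labeling is realized by an empty box
        x0 = min(x for x, y in pos); x1 = max(x for x, y in pos)
        y0 = min(y for x, y in pos); y1 = max(y for x, y in pos)
        for (x, y), lab in zip(points, labels):
            if not lab and x0 <= x <= x1 and y0 <= y <= y1:
                return False  # a negative point is trapped in the box
    return True

diamond = [(0, 1), (0, -1), (1, 0), (-1, 0)]
print(rect_shatters(diamond))           # True: 4 points in a diamond shatter
print(rect_shatters(diamond + [(0, 0)]))  # False: adding the center breaks it
```

This confirms the familiar fact that VC(axis-aligned rectangles) = 4, consistent with VC(intervals) = 2 for each factor; note the cross-product construction in the problem does not simply give n1 + n2 in general, which is the point of the exercise.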
 - Determine how many training samples are needed if you have a
multilayer perceptron with 200 weights and want a PAC guarantee of
less than a 2% chance of more than a 3% generalization error rate.
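One way to get a number here is the classical sufficient-sample-size bound of Blumer et al.: m >= max((4/eps)*log2(2/delta), (8d/eps)*log2(13/eps)), where d is the VC dimension. Treating the VC dimension as roughly the number of weights is itself a crude assumption (for threshold networks it is closer to W log W); your lecture notes may use a bound with different constants, so take this sketch as one defensible plug-in, not the official answer:

```python
import math

def pac_sample_size(vc_dim, eps, delta):
    """Sufficient sample size from the Blumer et al. VC bound:
    m >= max( (4/eps)*log2(2/delta), (8*d/eps)*log2(13/eps) ).
    Constants vary between textbooks; this is one standard version."""
    a = (4 / eps) * math.log2(2 / delta)
    b = (8 * vc_dim / eps) * math.log2(13 / eps)
    return math.ceil(max(a, b))

# 200 weights, at most 2% chance (delta) of more than 3% error (eps).
print(pac_sample_size(200, eps=0.03, delta=0.02))
```

The answer lands in the hundreds of thousands of examples, which is the usual reminder that worst-case PAC bounds are very pessimistic relative to practice.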
- Determine how few weights you need if you have 50,000 examples
in your training set and you want to make a PAC guarantee that your
multilayer perceptron will have less than a 5% chance of more than a
1% generalization error rate.
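This direction just inverts the same kind of bound: fixing m, eps, and delta, solve for the largest VC dimension d (again crudely identified with the weight count) that the guarantee supports. Using the Blumer et al. form m >= max((4/eps)*log2(2/delta), (8d/eps)*log2(13/eps)), a hedged sketch (the constants, and the d-equals-weights identification, are assumptions as in the previous problem):

```python
import math

def max_vc_dim(m, eps, delta):
    """Largest VC dimension d for which the Blumer et al. sufficient bound
    m >= max( (4/eps)*log2(2/delta), (8*d/eps)*log2(13/eps) )
    is satisfied by m samples; returns 0 if even d=0 fails the delta term."""
    if (4 / eps) * math.log2(2 / delta) > m:
        return 0  # the VC-independent term alone already exceeds m
    return math.floor(m * eps / (8 * math.log2(13 / eps)))

# 50,000 examples, at most 5% chance (delta) of more than 1% error (eps).
print(max_vc_dim(50000, eps=0.01, delta=0.05))
```

The allowed weight count comes out strikingly small, which again illustrates how loose these worst-case guarantees are for realistically sized networks.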