I've put up some relevant lecture notes on this topic.

- Figure out the VC dimension of the following concept classes, by showing *n* points that can be shattered and giving a handwavy explanation of why no set of *n+1* points can be shattered. In all these examples, the input space is two-dimensional.
  - Unions of two half-planes (ie the OR of two perceptrons).
  - A second-degree polynomial threshold, ie accept points (x_1, x_2) for which p(x_1, x_2) > 0, where p(x_1, x_2) is a quadratic polynomial. (A brute-force shattering check for this case is sketched in the first code snippet after this list.)
  - Let C_1 and C_2 both be concept classes whose elements have one-dimensional inputs, with VC(C_1) = n_1 and VC(C_2) = n_2. Let C be the "cross product" of C_1 and C_2, ie for each concept c_1 in C_1 and c_2 in C_2 there is a concept c in C, and c accepts an input (x_1, x_2) when c_1 accepts x_1 *and* c_2 accepts x_2. Example: if C_1 and C_2 are both the concept class of one-dimensional intervals (ie accept an input x if a < x < b for constants a and b), then C would be the concept class of axis-aligned rectangles.

- Determine how many training samples are needed if you have a multilayer perceptron with 200 weights and want a PAC guarantee of less than a 2% chance of more than a 3% generalization error rate.
- Determine how few weights you need if you have 50,000 examples in your training set and you want to make a PAC guarantee that your multilayer perceptron will have less than a 5% chance of more than a 1% generalization error rate. (A back-of-the-envelope calculation for these last two questions is sketched in the second code snippet below.)
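
For the quadratic-threshold case, it may help to notice that p(x_1, x_2) > 0 is a *linear* threshold over the lifted features (1, x_1, x_2, x_1^2, x_2^2, x_1 x_2), so whether a particular finite point set is shattered can be checked mechanically. Below is a minimal Python sketch of such a check; the helper names (lift, realizable, shattered), the use of scipy.optimize.linprog, and the sample point set are my own illustrative choices, not part of the assignment. It only verifies a candidate set you supply; the handwavy argument for why no larger set can be shattered still has to be made by hand.

```python
# Check whether a given set of 2-D points is shattered by quadratic polynomial
# thresholds, i.e. classifiers of the form sign(p(x1, x2)) with p a degree-2
# polynomial.  Since p(x1, x2) > 0 is linear in the lifted features
# (1, x1, x2, x1^2, x2^2, x1*x2), each labeling can be tested with a small LP.
import itertools
import numpy as np
from scipy.optimize import linprog

def lift(x):
    """Degree-2 polynomial feature map for a 2-D point x = (x1, x2)."""
    x1, x2 = x
    return np.array([1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2])

def realizable(points, labels):
    """Is there a quadratic p with p(x) > 0 exactly on the points labeled +1?
    Strict separation is equivalent (after rescaling p) to separation with
    margin 1, so test feasibility of  label * (w . lift(x)) >= 1  for all points."""
    A_ub = np.array([-y * lift(x) for x, y in zip(points, labels)])
    b_ub = -np.ones(len(points))
    res = linprog(c=np.zeros(6), A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * 6)
    return res.status == 0          # status 0: a feasible w was found

def shattered(points):
    """True iff every +/-1 labeling of the points is realizable."""
    return all(realizable(points, labels)
               for labels in itertools.product([1, -1], repeat=len(points)))

if __name__ == "__main__":
    # Six points on the unit circle -- just an illustrative candidate set; whether
    # this particular set is shattered (and finding one that is) is the exercise.
    pts = [(np.cos(t), np.sin(t)) for t in np.linspace(0, 2 * np.pi, 7)[:-1]]
    print("shattered:", shattered(pts))
```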

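For the last two questions, here is a back-of-the-envelope helper. It uses one commonly quoted *sufficient* sample-size bound for PAC learning a class of VC dimension d (due to Blumer, Ehrenfeucht, Haussler and Warmuth), and, as a rough stand-in, pretends the number of weights in the multilayer perceptron is its VC dimension. Both the choice of bound and that identification are assumptions on my part; the lecture notes may ask you to use a different bound.

```python
# Sufficient sample size for PAC learning a class of VC dimension d
# (Blumer, Ehrenfeucht, Haussler & Warmuth):
#     m  >=  max( (4/eps) * log2(2/delta),  (8*d/eps) * log2(13/eps) )
# Here eps is the allowed generalization error and delta the allowed failure
# probability.  Treating #weights as the VC dimension is only a rough proxy.
from math import log2

def samples_needed(d, eps, delta):
    """Sufficient number of training examples for VC dimension d."""
    return max((4 / eps) * log2(2 / delta), (8 * d / eps) * log2(13 / eps))

def max_vc_dim(m, eps, delta):
    """Largest VC dimension d for which samples_needed(d, eps, delta) <= m."""
    d = 0
    while samples_needed(d + 1, eps, delta) <= m:
        d += 1
    return d

if __name__ == "__main__":
    # 200 weights, <2% chance of >3% generalization error:
    print(samples_needed(d=200, eps=0.03, delta=0.02))
    # 50,000 examples, <5% chance of >1% generalization error:
    print(max_vc_dim(m=50_000, eps=0.01, delta=0.05))
```
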
Barak Pearlmutter <bap@cs.unm.edu>