Abstract
We consider the problem of learning a concept from examples in Valiant's distribution-free model. (An essentially equivalent model, if one ignores issues of computational difficulty, was studied by Vapnik and Chervonenkis.) We introduce the notion of dynamic sampling, in which the number of examples examined may grow with the complexity of the target concept. This method is used to establish the learnability of various concept classes of infinite Vapnik-Chervonenkis (VC) dimension. We also discuss an important variation on the problem of learning from examples, called approximating from examples, in which we do not assume that the target concept T is a member of the concept class C from which approximations are chosen. This problem takes on particular interest when the VC dimension of C is infinite. Finally, we discuss the problem of computing the VC dimension of a finite concept class defined on a finite domain and consider the structure of classes of fixed small dimension.
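For the finite case mentioned in the last sentence, the VC dimension can in principle be computed by brute force: a set of points is shattered if the concept class induces all possible labelings on it. The sketch below is not taken from the paper; it is a minimal illustration of that definition, assuming concepts are given explicitly as subsets of a finite domain, and the example with intervals is chosen only for demonstration.

```python
from itertools import combinations

def vc_dimension(domain, concepts):
    """Brute-force VC dimension of a finite concept class on a finite domain.

    domain:   iterable of points
    concepts: iterable of sets, each a subset of domain
    Returns the size of the largest subset of domain shattered by concepts.
    """
    points = list(domain)
    classes = [frozenset(c) for c in concepts]
    best = 0
    for d in range(1, len(points) + 1):
        shattered = False
        for subset in combinations(points, d):
            # dichotomies the class induces on this subset
            labelings = {frozenset(c & set(subset)) for c in classes}
            if len(labelings) == 2 ** d:  # all 2^d labelings realized
                shattered = True
                break
        if shattered:
            best = d
        else:
            break  # subsets of shattered sets are shattered, so we can stop
    return best

# Example: intervals over {1, 2, 3, 4} have VC dimension 2.
domain = range(1, 5)
concepts = [set(range(a, b)) for a in range(1, 6) for b in range(a, 6)]
print(vc_dimension(domain, concepts))  # -> 2
```

As the paper's last problem suggests, this naive enumeration is exponential in the domain size, which is what makes the complexity of computing the VC dimension an interesting question in its own right.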
| Original language | English |
| --- | --- |
| Pages (from-to) | 33-49 |
| Number of pages | 17 |
| Journal | Information and Computation |
| Volume | 90 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 1991 |
Bibliographical note
Funding Information: This paper was prepared with support from NSF Grant DCR-8607494, ARO Grant DAAL-03-86-K-0171, and the Siemens Corporation.