Results on learnability and the Vapnik-Chervonenkis dimension

Nathan Linial*, Yishay Mansour, Ronald L. Rivest

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

We consider the problem of learning a concept from examples in the distribution-free model of Valiant. (An essentially equivalent model, if one ignores issues of computational difficulty, was studied by Vapnik and Chervonenkis.) We introduce the notion of dynamic sampling, wherein the number of examples examined may increase with the complexity of the target concept. This method is used to establish the learnability of various concept classes of infinite Vapnik-Chervonenkis (VC) dimension. We also discuss an important variation on the problem of learning from examples, called approximating from examples. Here we do not assume that the target concept T is a member of the concept class C from which approximations are chosen. This problem takes on particular interest when the VC dimension of C is infinite. Finally, we discuss the problem of computing the VC dimension of a finite concept class defined on a finite domain and consider the structure of classes of a fixed small dimension.
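As a concrete illustration of the last point, the sketch below computes the VC dimension of a finite concept class over a finite domain by naive enumeration, checking every subset of the domain for shattering. This brute-force check is exponential in the domain size and is not the algorithm analyzed in the paper; the function name vc_dimension and the toy class of initial segments are illustrative assumptions.

```python
from itertools import combinations

def vc_dimension(domain, concepts):
    """Brute-force VC dimension of a finite concept class.

    domain   -- a finite set of points
    concepts -- an iterable of concepts, each a set of points

    Returns the size of the largest subset S of `domain` that is
    shattered, i.e. every subset of S arises as S intersected with
    some concept in the class.
    """
    concepts = [frozenset(c) for c in concepts]
    points = list(domain)
    # Try subset sizes from largest to smallest so we can stop early.
    for d in range(len(points), 0, -1):
        for S in combinations(points, d):
            S = frozenset(S)
            traces = {S & c for c in concepts}
            if len(traces) == 2 ** d:  # every labelling of S is realized
                return d
    return 0  # only the empty set is shattered

# Toy example: initial segments {0,...,k-1} over the domain {0,1,2,3}
# shatter single points but no pair, so the VC dimension is 1.
domain = {0, 1, 2, 3}
initial_segments = [frozenset(range(k)) for k in range(5)]
print(vc_dimension(domain, initial_segments))  # -> 1
```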

Original language: English
Pages (from-to): 33-49
Number of pages: 17
Journal: Information and Computation
Volume: 90
Issue number: 1
DOIs
State: Published - Jan 1991

Bibliographical note

Funding Information:
* This paper was prepared with support from NSF Grant DCR-8607494, ARO Grant DAAL-03-86-K-0171, and the Siemens Corporation.

