Effective Semisupervised Learning on Manifolds.

Amir Globerson, Roi Livni, Shai Shalev-Shwartz

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution (peer-reviewed)

Abstract

The abundance of unlabeled data makes semi-supervised learning (SSL) an attractive approach for improving the accuracy of learning systems. However, we are still far from a complete theoretical understanding of the benefits of this learning scenario in terms of sample complexity. In particular, for many natural learning settings it can in fact be shown that SSL does not improve sample complexity. Thus far, the only case where SSL provably helps, without compatibility assumptions, is a recent combinatorial construction of Darnstadt et al. Deriving similar theoretical guarantees for more commonly used approaches to SSL remains a challenge. Here, we provide the first analysis of manifold-based SSL, where there is a provable gap between supervised learning and SSL, and this gap can be arbitrarily large. Proving the required lower bound is a technical challenge, involving tools from geometric measure theory. The algorithm we analyse is similar to subspace clustering, and thus our results demonstrate that this method can be used to improve sample complexity.
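
The abstract relates the analysed algorithm to subspace clustering. Purely as an illustrative sketch (not the paper's algorithm, and under simplifying assumptions: two noisy one-dimensional linear subspaces in R^3, each carrying a single class, grouped with off-the-shelf spectral clustering), the Python snippet below shows how clustering the unlabeled data by subspace first can shrink the labeling requirement to a single queried label per subspace, which is the kind of supervised-versus-SSL gap the abstract describes.

import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)

# Two 1-D linear subspaces (lines through the origin) in R^3, one per class.
directions = rng.normal(size=(2, 3))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)

# Abundant unlabeled data: points sampled along each line, plus small noise.
n_per_class = 200
X_parts, y_parts = [], []
for c, d in enumerate(directions):
    t = rng.uniform(0.5, 5.0, size=n_per_class) * rng.choice([-1.0, 1.0], size=n_per_class)
    X_parts.append(np.outer(t, d) + 0.01 * rng.normal(size=(n_per_class, 3)))
    y_parts.append(np.full(n_per_class, c))
X = np.vstack(X_parts)
y_true = np.concatenate(y_parts)

# Step 1 (unsupervised): cluster points by subspace using an |cosine| affinity,
# which ignores the sign ambiguity of a line through the origin.
Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
affinity = np.abs(Xn @ Xn.T)
clusters = SpectralClustering(n_clusters=2, affinity="precomputed",
                              random_state=0).fit_predict(affinity)

# Step 2 (label-cheap): query one label per cluster and propagate it.
pred = np.empty_like(y_true)
for k in range(2):
    idx = np.flatnonzero(clusters == k)
    pred[idx] = y_true[idx[0]]  # a single labeled example names the whole cluster

print("accuracy with one queried label per cluster:", (pred == y_true).mean())

The |cosine| affinity here is just one convenient proxy for co-membership in a line through the origin; any subspace-clustering routine could play the same role in this sketch.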
Original language: English
Title of host publication: COLT 2017
Publisher: PMLR
Pages: 978-1003
Number of pages: 26
State: Published - 2017
Event: Conference on Learning Theory, COLT 2017 - Amsterdam, Netherlands
Duration: 7 Jul 2017 - 10 Jul 2017
https://proceedings.mlr.press/v65

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
Volume: 65
ISSN (Electronic): 2640-3498

Conference

Conference: Conference on Learning Theory, COLT 2017
Abbreviated title: COLT 2017
Country/Territory: Netherlands
City: Amsterdam
Period: 7/07/17 - 10/07/17
Internet address: https://proceedings.mlr.press/v65
