Learning complexity vs communication complexity

Nati Linial*, Adi Shraibman

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

This paper has two main focal points. We first consider an important class of machine learning algorithms: large margin classifiers, such as Support Vector Machines. The notion of margin complexity quantifies the extent to which a given class of functions can be learned by large margin classifiers. We prove that, up to a small multiplicative constant, margin complexity equals the inverse of discrepancy. This establishes a strong tie between seemingly very different notions from two distinct areas. In the same way that matrix rigidity is related to rank, we introduce the notion of rigidity of margin complexity. We prove that sign matrices with small margin-complexity rigidity are very rare. This leads to the question of proving lower bounds on the rigidity of margin complexity. Quite surprisingly, this question turns out to be closely related to basic open problems in communication complexity, e.g., whether PSPACE can be separated from the polynomial hierarchy in communication complexity. Communication is a key ingredient in many types of learning, which explains the relations between the field of learning theory and that of communication complexity [6, 10, 16, 26]. The results of this paper constitute another link in this rich web of relations, and have already been applied toward the solution of several open problems in communication complexity [18, 20, 29].
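For orientation, here is a brief sketch of the two central quantities for a sign matrix A in {-1,+1}^{m x n}, using the definitions standard in the literature; the exact normalization is stated from memory and may differ from the paper's conventions by small constant factors.

% Margin of a sign matrix A = (a_ij): the best margin achievable when
% rows and columns are realized by vectors x_i, y_j in some Hilbert
% space with sign(<x_i, y_j>) = a_ij for all i, j. Margin complexity
% mc(A) is its inverse.
\[
  m(A) = \sup_{x_1,\dots,x_m,\; y_1,\dots,y_n}\; \min_{i,j}\;
         \frac{a_{ij}\,\langle x_i, y_j\rangle}{\|x_i\|\,\|y_j\|},
  \qquad mc(A) = \frac{1}{m(A)}.
\]
% Discrepancy: fix a probability distribution P on the entries of A,
% take the largest P-weighted imbalance over combinatorial rectangles
% S x T, then minimize over the choice of P.
\[
  \mathrm{disc}(A) = \min_{P}\; \max_{S \subseteq [m],\, T \subseteq [n]}
    \Bigl|\, \sum_{i \in S} \sum_{j \in T} P_{ij}\, a_{ij} \,\Bigr|.
\]
% The first main result of the paper, as stated in the abstract: the
% two quantities agree up to a small multiplicative constant,
\[
  mc(A) = \Theta\bigl(1/\mathrm{disc}(A)\bigr).
\]

The mc-rigidity mentioned in the abstract is, in the same spirit as matrix rigidity, roughly the number of sign entries of A that must be changed before its margin complexity drops below a given threshold; the rarity result says that almost all sign matrices have high mc-rigidity in this sense.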

Original language: English
Pages (from-to): 227-245
Number of pages: 19
Journal: Combinatorics, Probability and Computing
Volume: 18
Issue number: 1-2
State: Published - Mar 2009
