Abstract
In many real-world applications, the number of examples to learn from is plentiful, but we can obtain only limited information on each individual example. We study the possibilities of efficient, provably correct, large-scale learning in such settings. The main theme we would like to establish is that a large number of examples can compensate for the lack of full information on each individual example. The type of partial information we consider can be due to inherent noise or to constraints on the type of interaction with the data source. In particular, we describe and analyze algorithms for budgeted learning, in which the learner can only view a few attributes of each training example (Cesa-Bianchi, Shalev-Shwartz, and Shamir 2010a; 2010c), and algorithms for learning kernel-based predictors when individual examples are corrupted by random noise (Cesa-Bianchi, Shalev-Shwartz, and Shamir 2010b).
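To make the budgeted-learning access model concrete, here is a minimal sketch (not the authors' actual algorithm) of how a learner that may read only k attributes per example can still form an unbiased estimate of a quantity, such as an inner product, that nominally depends on all attributes. Sampling coordinates uniformly with replacement and rescaling by d/k keeps the estimate unbiased; averaging over many examples then compensates for the per-example information loss. All names here are illustrative.

```python
import numpy as np

def sampled_dot(w, x, k, rng):
    """Unbiased estimate of <w, x> that reads only k attributes of x.

    Illustrative sketch of the partial-information access model: sample k
    coordinates uniformly with replacement and rescale by d/k, so that
    E[(d/k) * sum_j w[i_j] * x[i_j]] = <w, x>.
    """
    d = len(x)
    idx = rng.integers(0, d, size=k)  # the only coordinates of x we read
    return (d / k) * float(np.sum(w[idx] * x[idx]))
```

Any single estimate is noisy, but averaging over many independently sampled examples drives the error down, which is the sense in which "many examples compensate for partial information."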
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings of the 25th AAAI Conference on Artificial Intelligence, AAAI 2011 |
| Publisher | AAAI Press |
| Pages | 1547-1550 |
| Number of pages | 4 |
| ISBN (Electronic) | 9781577355083 |
| State | Published - 11 Aug 2011 |
| Event | 25th AAAI Conference on Artificial Intelligence, AAAI 2011 - San Francisco, United States. Duration: 7 Aug 2011 → 11 Aug 2011 |
Publication series

| Name | Proceedings of the 25th AAAI Conference on Artificial Intelligence, AAAI 2011 |
| --- | --- |
Conference

| Conference | 25th AAAI Conference on Artificial Intelligence, AAAI 2011 |
| --- | --- |
| Country/Territory | United States |
| City | San Francisco |
| Period | 7/08/11 → 11/08/11 |
Bibliographical note

Publisher Copyright: Copyright © 2011, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.