TY - JOUR
T1 - Sampling and representation complexity of revenue maximization
AU - Dughmi, Shaddin
AU - Han, Li
AU - Nisan, Noam
N1 - Publisher Copyright:
© Springer International Publishing Switzerland 2014.
PY - 2014
Y1 - 2014
AB - We consider (approximate) revenue maximization in mechanisms where the distribution on input valuations is given via “black box” access to samples from the distribution. We analyze the following model: a single agent, m outcomes, and valuations represented as m-dimensional vectors indexed by the outcomes and drawn from an arbitrary distribution presented as a black box. We observe that the number of samples required – the sample complexity – is tightly related to the representation complexity of an approximately revenue-maximizing auction. Our main results are upper bounds and an exponential lower bound on these complexities. We also observe that the computational task of “learning” a good mechanism from a sample is nontrivial, requiring careful use of regularization in order to avoid over-fitting the mechanism to the sample. We establish preliminary positive and negative results pertaining to the computational complexity of learning a good mechanism for the original distribution by operating on a sample from said distribution.
UR - http://www.scopus.com/inward/record.url?scp=84914126908&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-13129-0_22
DO - 10.1007/978-3-319-13129-0_22
M3 - Article
AN - SCOPUS:84914126908
SN - 0302-9743
VL - 8877
SP - 277
EP - 291
JO - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
JF - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
ER -