## Abstract

We study the problem of learning, from unlabeled samples, very general statistical mixture models on large finite sets. Specifically, the model to be learned, θ, is a probability distribution over probability distributions p, where each such p is a probability distribution over [n] = {1,2,...,n}. When we sample from θ, we do not observe p directly, but only indirectly and in a very noisy fashion: we observe K independent samples drawn from [n] according to p. The problem is to infer θ to high accuracy in transportation (earthmover) distance. We give the first efficient algorithms for learning this mixture model without making any restricting assumptions on the structure of the distribution θ. We bound the quality of the solution as a function of the sample size K and the number of samples used. Our model and results have applications to a variety of unsupervised learning scenarios, including learning topic models and collaborative filtering.
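The two-stage sampling process described above can be sketched in code. This is a minimal illustration only: the particular mixing distribution θ (here a finite mixture over three component distributions), and the values of n and K, are arbitrary choices for demonstration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 6   # support size: [n] = {1, ..., n}
K = 10  # number of i.i.d. draws observed from the hidden distribution p

# Toy mixing distribution theta: a finite mixture over three
# component distributions on [n], with mixing weights summing to 1.
components = np.array([
    [0.50, 0.30, 0.10, 0.05, 0.03, 0.02],
    [0.02, 0.03, 0.05, 0.10, 0.30, 0.50],
    [1/6, 1/6, 1/6, 1/6, 1/6, 1/6],
])
weights = np.array([0.5, 0.3, 0.2])

def sample_observation():
    """One observation from the model: draw p ~ theta (p stays hidden),
    then observe K i.i.d. samples from [n] distributed according to p."""
    p = components[rng.choice(len(weights), p=weights)]
    return rng.choice(n, size=K, p=p) + 1  # values in {1, ..., n}

observations = [sample_observation() for _ in range(5)]
```

Each observation is thus a length-K multiset over [n]; the learner sees only these noisy "documents," never the underlying p, which is what makes inferring θ nontrivial.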

Original language | English |
---|---|
Title of host publication | STOC 2015 - Proceedings of the 2015 ACM Symposium on Theory of Computing |
Publisher | Association for Computing Machinery |
Pages | 743-752 |
Number of pages | 10 |
ISBN (Electronic) | 9781450335362 |
State | Published - 14 Jun 2015 |
Event | 47th Annual ACM Symposium on Theory of Computing, STOC 2015 - Portland, United States. Duration: 14 Jun 2015 → 17 Jun 2015 |

### Publication series

Name | Proceedings of the Annual ACM Symposium on Theory of Computing |
---|---|
Volume | 14-17-June-2015 |
ISSN (Print) | 0737-8017 |

### Conference

Conference | 47th Annual ACM Symposium on Theory of Computing, STOC 2015 |
---|---|
Country/Territory | United States |
City | Portland |
Period | 14/06/15 → 17/06/15 |

### Bibliographical note

Publisher Copyright: © 2015 ACM.

## Keywords

- Approximation theory
- Convex geometry
- Kantorovich-Rubinstein duality
- Mixture learning
- Randomized algorithms
- Spectral methods
- Transportation distance