The likelihood function of a finite mixture model is non-convex with multiple local maxima, and commonly used iterative algorithms such as EM converge to different solutions depending on the initial conditions. In this paper we ask: is it possible to assess how far we are from the global maximum of the likelihood? Since the likelihood of a finite mixture model can grow without bound by centering a Gaussian on a single data point and shrinking its covariance, we constrain the problem by assuming that the parameters of the individual components belong to a large discrete set (e.g. estimating a mixture of two Gaussians whose means and variances are drawn from a set of a million candidate means and variances). For this setting we show that a simple upper bound on the likelihood can be computed using convex optimization, and we analyze conditions under which the bound is guaranteed to be tight. The bound can then be used to assess the quality of solutions found by EM (with the final result projected onto the discrete set) or by any other mixture estimation algorithm. For any dataset, our method yields a finite mixture model together with a dataset-specific bound on how far its likelihood is from the global optimum.
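As a concrete illustration of how such a bound might be computed, the minimal sketch below (Python with NumPy/SciPy; the function name candidate_upper_bound, the candidate grid, and the weight-relaxation formulation are illustrative assumptions, not necessarily the paper's exact construction) fixes a discrete candidate set of Gaussian components and relaxes the "use at most K candidates" constraint to arbitrary mixing weights over the whole set. The resulting log-likelihood is concave in the weights, so it can be maximized reliably, and its optimum upper-bounds the likelihood of every K-component mixture built from those candidates.

import numpy as np
from scipy.stats import norm

def candidate_upper_bound(x, means, stds, n_iter=500):
    # Densities of every data point under every candidate component (n x m).
    F = norm.pdf(x[:, None], loc=means[None, :], scale=stds[None, :])
    m = F.shape[1]
    w = np.full(m, 1.0 / m)                 # start from uniform mixing weights
    for _ in range(n_iter):
        resp = F * w                        # responsibilities (E-step)
        resp /= resp.sum(axis=1, keepdims=True)
        w = resp.mean(axis=0)               # re-estimate weights only (M-step);
                                            # this ascends a concave objective
    return np.log(F @ w).sum(), w           # relaxed optimal log-likelihood

# Hypothetical usage: a coarse grid of candidate (mean, std) pairs.
x = np.concatenate([np.random.normal(-2.0, 1.0, 200),
                    np.random.normal(3.0, 0.5, 200)])
mu_grid, sd_grid = np.meshgrid(np.linspace(-5, 5, 100), np.linspace(0.3, 2.0, 10))
bound, _ = candidate_upper_bound(x, mu_grid.ravel(), sd_grid.ravel())

Under this assumed setup, any K-component mixture whose parameters come from the same grid has log-likelihood at most bound, so the gap between bound and the log-likelihood of an EM solution projected onto the grid indicates how far that solution can be from the constrained global optimum.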
Original language: American English
Title of host publication: 2016 23rd International Conference on Pattern Recognition, ICPR 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
State: Published - 1 Jan 2016
Event: 23rd International Conference on Pattern Recognition, ICPR 2016 - Cancun, Mexico
Duration: 4 Dec 2016 → 8 Dec 2016
Publication series: Proceedings - International Conference on Pattern Recognition
Conference: 23rd International Conference on Pattern Recognition, ICPR 2016
Period: 4 Dec 2016 → 8 Dec 2016
Bibliographical note: Funding Information:
This work has been supported by the ISF. The authors wish to thank the anonymous reviewers for their helpful comments.
© 2016 IEEE.