Natural images, Gaussian mixtures and dead leaves

Daniel Zoran, Yair Weiss

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Simple Gaussian Mixture Models (GMMs) learned from pixels of natural image patches have recently been shown to be surprisingly strong performers in modeling the statistics of natural images. Here we provide an in-depth analysis of this simple yet rich model. We show that such a GMM is able to compete with even the most successful models of natural images in log-likelihood scores, denoising performance and sample quality. We provide an analysis of what such a model learns from natural images as a function of the number of mixture components, including covariance structure, contrast variation and intricate structures such as textures, boundaries and more. Finally, we show that the salient properties of the GMM learned from natural images can be derived from a simplified Dead Leaves model which explicitly models occlusion, explaining its surprising success relative to other models.
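
As an illustration only (not the authors' implementation), the following Python sketch fits a full-covariance GMM to flattened natural image patches with scikit-learn and reports the mean per-patch log likelihood of the kind used to compare models in the abstract. The arrays train_patches and test_patches are hypothetical stand-ins for DC-removed 8x8 patches of shape (N, 64).

import numpy as np
from sklearn.mixture import GaussianMixture

def fit_patch_gmm(patches, n_components=200, seed=0):
    # Full covariance matrices are the point of the model: the learned
    # covariance structure (contrast, textures, boundaries) is what is analysed.
    gmm = GaussianMixture(n_components=n_components,
                          covariance_type="full",
                          random_state=seed)
    gmm.fit(patches)
    return gmm

# Hypothetical usage with random data standing in for real image patches.
rng = np.random.default_rng(0)
train_patches = rng.normal(size=(5000, 64))
test_patches = rng.normal(size=(1000, 64))
gmm = fit_patch_gmm(train_patches, n_components=20)
print("mean log likelihood per patch (nats):", gmm.score(test_patches))
samples, _ = gmm.sample(500)   # draw new patches to inspect sample quality

The dead leaves construction mentioned in the abstract can likewise be sketched as a toy generator (again illustrative, not the paper's exact model): opaque disks with random positions, sizes and intensities are layered so that earlier disks occlude later ones, with radii drawn from a roughly scale-invariant power law.

def dead_leaves(size=128, n_disks=2000, r_min=2.0, r_max=40.0, seed=0):
    rng = np.random.default_rng(seed)
    img = np.zeros((size, size))
    covered = np.zeros((size, size), dtype=bool)
    yy, xx = np.mgrid[0:size, 0:size]
    for _ in range(n_disks):
        cx, cy = rng.uniform(0, size, 2)
        # Radius density roughly proportional to 1/r^3 on [r_min, r_max].
        r = r_min / np.sqrt(1.0 - rng.uniform() * (1.0 - (r_min / r_max) ** 2))
        mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= r ** 2
        new = mask & ~covered      # occlusion: only paint still-visible pixels
        img[new] = rng.uniform()   # each "leaf" gets a single random intensity
        covered |= new
        if covered.all():
            break
    return img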

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 15 - Proceedings of the 2002 Conference, NIPS 2002
Publisher: Neural Information Processing Systems Foundation
ISBN (Print): 0262025507, 9780262025508
State: Published - 2003
Event: 16th Annual Neural Information Processing Systems Conference, NIPS 2002 - Vancouver, BC, Canada
Duration: 9 Dec 2002 → 14 Dec 2002

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Conference

Conference: 16th Annual Neural Information Processing Systems Conference, NIPS 2002
Country/Territory: Canada
City: Vancouver, BC
Period: 9/12/02 → 14/12/02
