Natural images, Gaussian mixtures and dead leaves

Daniel Zoran, Yair Weiss

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

78 Scopus citations

Abstract

Simple Gaussian Mixture Models (GMMs) learned from pixels of natural image patches have recently been shown to be surprisingly strong performers in modeling the statistics of natural images. Here we provide an in-depth analysis of this simple yet rich model. We show that such a GMM model is able to compete with even the most successful models of natural images in log-likelihood scores, denoising performance and sample quality. We provide an analysis of what such a model learns from natural images as a function of the number of mixture components, including covariance structure, contrast variation and intricate structures such as textures, boundaries and more. Finally, we show that the salient properties of the GMM learned from natural images can be derived from a simplified Dead Leaves model which explicitly models occlusion, explaining its surprising success relative to other models.
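The setting the abstract describes can be illustrated with a minimal sketch (not the authors' code): generate a dead-leaves-style image by stacking occluding constant-intensity discs, extract small patches, remove each patch's mean (DC) component, and fit a full-covariance GMM. All sizes, counts and component numbers below are arbitrary assumptions for illustration.

```python
# Illustrative sketch (not the paper's implementation): fit a GMM to patches
# from a synthetic dead-leaves-style image. Parameters are arbitrary choices.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic dead-leaves-style image: each new disc occludes what is below it.
H = W = 64
img = np.zeros((H, W))
yy, xx = np.ogrid[:H, :W]
for _ in range(200):
    cy, cx = rng.integers(0, H), rng.integers(0, W)
    r = rng.integers(2, 10)
    mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2
    img[mask] = rng.random()  # constant intensity per "leaf"

# Extract 8x8 patches and subtract each patch's mean (DC component).
P = 8
patches = np.array([
    img[i:i + P, j:j + P].ravel()
    for i in range(0, H - P, 2) for j in range(0, W - P, 2)
])
patches -= patches.mean(axis=1, keepdims=True)

# Fit a full-covariance GMM; the paper studies how learned structure
# (covariances, contrast, boundaries) emerges as components are added.
gmm = GaussianMixture(n_components=5, covariance_type="full",
                      random_state=0).fit(patches)
avg_ll = gmm.score(patches)  # mean log-likelihood per patch
print(f"mean log-likelihood per patch: {avg_ll:.2f}")
```

Increasing `n_components` and comparing `gmm.score` on held-out patches is one simple way to reproduce the abstract's "as a function of the number of mixture components" axis.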

Original language: American English
Title of host publication: Advances in Neural Information Processing Systems 25
Subtitle of host publication: 26th Annual Conference on Neural Information Processing Systems 2012, NIPS 2012
Pages: 1736-1744
Number of pages: 9
State: Published - 2012
Event: 26th Annual Conference on Neural Information Processing Systems 2012, NIPS 2012 - Lake Tahoe, NV, United States
Duration: 3 Dec 2012 - 6 Dec 2012

Publication series

Name: Advances in Neural Information Processing Systems
Volume: 3
ISSN (Print): 1049-5258

Conference

Conference: 26th Annual Conference on Neural Information Processing Systems 2012, NIPS 2012
Country/Territory: United States
City: Lake Tahoe, NV
Period: 3/12/12 - 6/12/12
