Sparse image coding using a 3D non-negative tensor factorization

Tamir Hazan*, Simon Polak, Amnon Shashua

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

160 Scopus citations

Abstract

We introduce an algorithm for a non-negative 3D tensor factorization (NTF) for the purpose of establishing a local-parts feature decomposition from an object class of images. In the past, such a decomposition was obtained using non-negative matrix factorization (NMF), where images were vectorized before being factored. A tensor factorization, on the other hand, preserves the 2D representation of the images and provides a unique factorization (unlike NMF, which is not unique). The resulting "factors" of the NTF are both sparse (as with NMF) and separable, allowing efficient convolution with a test image. Results show a decomposition superior to NMF on all fronts: degree of sparsity, absence of ghost residue due to invariant parts, and coding efficiency around an order of magnitude better. Experiments using the local-parts decomposition for face detection with SVM and AdaBoost classifiers demonstrate that the recovered features are discriminative and highly effective for classification.
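The abstract does not spell out the factorization itself. As an illustration only, below is a minimal NumPy sketch of a non-negative CP (PARAFAC) decomposition of a 3D image stack via multiplicative updates, a standard generalization of the Lee-Seung NMF rules to tensors; the function name `ntf_cp`, the random initialization, and the update scheme are assumptions for this sketch, not the paper's exact algorithm. Each rank-1 term U[:, f] ⊗ V[:, f] is a separable, non-negative 2D "part".

```python
import numpy as np

def ntf_cp(T, rank, n_iter=200, eps=1e-9, seed=0):
    """Non-negative CP decomposition of a 3D tensor T (I x J x K)
    into factors U (I x rank), V (J x rank), W (K x rank) so that
    T[i, j, k] ~= sum_f U[i, f] * V[j, f] * W[k, f],
    using multiplicative updates that keep all factors non-negative."""
    I, J, K = T.shape
    rng = np.random.default_rng(seed)
    U, V, W = (rng.random((n, rank)) for n in (I, J, K))

    def unfold(X, mode):
        # Mode-n unfolding: matricize T with the given axis as rows.
        return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

    def khatri_rao(A, B):
        # Column-wise Kronecker product, shape (rows_A * rows_B, rank).
        return (A[:, None, :] * B[None, :, :]).reshape(-1, A.shape[1])

    T1, T2, T3 = unfold(T, 0), unfold(T, 1), unfold(T, 2)
    for _ in range(n_iter):
        # Lee-Seung-style updates generalized to 3D; eps avoids division by zero.
        U *= (T1 @ khatri_rao(V, W)) / (U @ ((V.T @ V) * (W.T @ W)) + eps)
        V *= (T2 @ khatri_rao(U, W)) / (V @ ((U.T @ U) * (W.T @ W)) + eps)
        W *= (T3 @ khatri_rao(U, V)) / (W @ ((U.T @ U) * (V.T @ V)) + eps)
    return U, V, W

# Example: factor a stack of 50 synthetic 32x32 "images" into 20 parts.
images = np.random.rand(32, 32, 50)   # placeholder for a real face-image stack
U, V, W = ntf_cp(images, rank=20)
part = np.outer(U[:, 0], V[:, 0])     # one separable, non-negative 2D part
```

Because each recovered part is a rank-1 matrix outer(u, v), convolving a test image with it reduces to two 1D convolutions, one per axis, which is the source of the efficiency claim in the abstract; with SciPy, for instance, `convolve1d(convolve1d(img, V[:, 0], axis=1), U[:, 0], axis=0)`.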

Original language: English
Title of host publication: Proceedings - 10th IEEE International Conference on Computer Vision, ICCV 2005
Pages: 50-57
Number of pages: 8
DOIs
State: Published - 2005
Event: 10th IEEE International Conference on Computer Vision, ICCV 2005 - Beijing, China
Duration: 17 Oct 2005 → 20 Oct 2005

Publication series

Name: Proceedings of the IEEE International Conference on Computer Vision
Volume: I

Conference

Conference: 10th IEEE International Conference on Computer Vision, ICCV 2005
Country/Territory: China
City: Beijing
Period: 17/10/05 → 20/10/05
