Distance metric between 3D models and 2D images for recognition and classification

D. Weinshall*, R. Basri

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

6 Scopus citations

Abstract

Similarity measurements between 3D objects and 2D images are useful for the tasks of object recognition and classification. Existing systems typically use image metrics, i.e., metrics that measure the difference in the image between the observed image and the nearest view of the object (e.g., the Euclidean distance between corresponding points). In this paper we introduce a different type of metric: transformation metrics. These metrics penalize the deformations applied to the object to produce the observed image. We present a transformation metric that optimally penalizes 'affine deformations' under weak-perspective. A closed-form solution, together with the nearest view according to this metric, is derived. The metric is shown to be equivalent to the Euclidean image metric, in the sense that the two bound each other from above and below. For the Euclidean image metric we offer a sub-optimal closed-form solution and an iterative scheme to compute the exact solution.
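The image-metric idea from the abstract can be sketched numerically: given corresponding 3D model points and 2D image points, find the nearest view of the model and measure the Euclidean distance to the observed image. The sketch below uses an unconstrained affine camera solved by least squares; it is an illustration only, not the paper's method — the paper constrains the transformation to weak-perspective and additionally derives a transformation metric that penalizes the deformation itself. The function name and interface are hypothetical.

```python
import numpy as np

def affine_image_distance(P, q):
    """Illustrative image metric: Euclidean distance between image
    points q (n x 2) and the nearest *unconstrained affine* view of
    model points P (n x 3).  Each image point is modelled as A @ p + t;
    A (2x3) and t (2,) are fit jointly by least squares.  Restricting
    A to a scaled rotation would give the weak-perspective setting
    discussed in the paper; this sketch does not impose that constraint.
    """
    n = P.shape[0]
    X = np.hstack([P, np.ones((n, 1))])                      # n x 4 homogeneous model coords
    M, _, _, _ = np.linalg.lstsq(X, q, rcond=None)           # 4 x 2 stacked [A^T; t]
    nearest_view = X @ M                                     # n x 2 nearest affine view
    return np.linalg.norm(nearest_view - q), nearest_view

# Usage: image generated by an exact affine projection of the model
# should be at (numerically) zero distance from its nearest view.
P = np.random.RandomState(0).randn(8, 3)
A = np.array([[1.0, 0.2, 0.0],
              [0.0, 1.0, 0.3]])
q = P @ A.T + np.array([0.5, -1.0])
d, view = affine_image_distance(P, q)
```

Because the affine family here is a superset of weak-perspective views, this residual lower-bounds the weak-perspective image distance, which is consistent with the abstract's use of mutual bounds between metrics.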

Original language: English
Title of host publication: IEEE Computer Vision and Pattern Recognition
Editors: Anon
Publisher: Publ by IEEE
Pages: 220-225
Number of pages: 6
ISBN (Print): 0818638826
State: Published - 1993
Event: Proceedings of the 1993 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - New York, NY, USA
Duration: 15 Jun 1993 - 18 Jun 1993

Publication series

Name: IEEE Computer Vision and Pattern Recognition

Conference

Conference: Proceedings of the 1993 IEEE Computer Society Conference on Computer Vision and Pattern Recognition
City: New York, NY, USA
Period: 15/06/93 - 18/06/93
