Non-rigid dense correspondence with applications for image enhancement

Yoav HaCohen*, Eli Shechtman, Dan B. Goldman, Dani Lischinski

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

260 Scopus citations

Abstract

This paper presents a new efficient method for recovering reliable local sets of dense correspondences between two images with some shared content. Our method is designed for pairs of images depicting similar regions acquired by different cameras and lenses, under non-rigid transformations, under different lighting, and over different backgrounds. We utilize a new coarse-to-fine scheme in which nearest-neighbor field computations using Generalized PatchMatch [Barnes et al. 2010] are interleaved with fitting a global non-linear parametric color model and aggregating consistent matching regions using locally adaptive constraints. Compared to previous correspondence approaches, our method combines the best of two worlds: It is dense, like optical flow and stereo reconstruction methods, and it is also robust to geometric and photometric variations, like sparse feature matching. We demonstrate the usefulness of our method using three applications for automatic example-based photograph enhancement: adjusting the tonal characteristics of a source image to match a reference, transferring a known mask to a new image, and kernel estimation for image deblurring.
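The interleaved scheme the abstract describes can be sketched at a single scale: compute a nearest-neighbor field (NNF), fit a global color model over the matches, and repeat with the color-corrected source. The following is a minimal illustrative sketch, not the paper's method: a brute-force patch search stands in for Generalized PatchMatch, a single global affine model `ref ≈ a·src + b` on grayscale images stands in for the non-linear parametric color model, and the coarse-to-fine pyramid and locally adaptive consistency aggregation are omitted. All function names here are hypothetical.

```python
import numpy as np

def nnf_brute_force(src, ref, p=3):
    """Brute-force nearest-neighbor field: for every p-by-p patch of src,
    record the (y, x) of its best-matching patch in ref under SSD.
    Stands in for Generalized PatchMatch, which approximates the same
    field far faster."""
    h, w = src.shape
    H, W = ref.shape
    nnf = np.zeros((h - p + 1, w - p + 1, 2), dtype=int)
    for y in range(h - p + 1):
        for x in range(w - p + 1):
            patch = src[y:y + p, x:x + p]
            best, best_d = (0, 0), np.inf
            for Y in range(H - p + 1):
                for X in range(W - p + 1):
                    d = np.sum((patch - ref[Y:Y + p, X:X + p]) ** 2)
                    if d < best_d:
                        best_d, best = d, (Y, X)
            nnf[y, x] = best
    return nnf

def fit_color_model(src, ref, nnf, p=3):
    """Least-squares fit of a global affine model ref ~= a*src + b over
    matched patch centers (a linear stand-in for the paper's non-linear
    parametric color model)."""
    s, r = [], []
    for y in range(nnf.shape[0]):
        for x in range(nnf.shape[1]):
            Y, X = nnf[y, x]
            s.append(src[y + p // 2, x + p // 2])
            r.append(ref[Y + p // 2, X + p // 2])
    A = np.vstack([np.asarray(s, float), np.ones(len(s))]).T
    a, b = np.linalg.lstsq(A, np.asarray(r, float), rcond=None)[0]
    return a, b

def align_colors(src, ref, iters=2):
    """Interleave NNF computation with color-model fitting: current
    matches refine the color model, and the updated model makes the
    next round of matching more photometrically meaningful."""
    a, b = 1.0, 0.0
    for _ in range(iters):
        nnf = nnf_brute_force(a * src + b, ref)
        a, b = fit_color_model(src, ref, nnf)
    return a, b
```

On real image pairs one would replace the brute-force search with PatchMatch, run the loop inside a coarse-to-fine pyramid, and keep only consistent matching regions, as the abstract describes; this sketch only shows the alternation between matching and model fitting.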

Original language: English
Title of host publication: Proceedings of ACM SIGGRAPH 2011, SIGGRAPH 2011
Volume: 30
Edition: 4
DOIs
State: Published - Jul 2011
Event: ACM SIGGRAPH 2011, SIGGRAPH 2011 - Vancouver, BC, Canada
Duration: 7 Aug 2011 - 11 Aug 2011

Conference

Conference: ACM SIGGRAPH 2011, SIGGRAPH 2011
Country/Territory: Canada
City: Vancouver, BC
Period: 7/08/11 - 11/08/11

Keywords

  • Color transfer
  • Correspondence
  • Deblurring
  • Nearest neighbor field
  • Patch match
