Probabilistic Analysis of Regularization

Daniel Keren, Michael Werman

Research output: Contribution to journal › Article › peer-review

Abstract

In order to use interpolated data wisely, it is important to have reliability and confidence measures associated with it. In this paper, we show how to compute the reliability, at each point, of any linear functional (for example, height or derivative) of a surface reconstructed using regularization. The proposed method is to define a probability structure on the class of possible objects (for example, surfaces) and compute the variance of the corresponding random variable (for example, the height at a certain point). This variance is a natural measure of uncertainty, and experiments have shown it to correlate well with reality. The probability distribution used is based on the Boltzmann distribution. The theoretical part of the work uses tools from classical analysis, functional analysis, and measure theory on function spaces. The theory was tested and applied to real depth images. It was also applied to formalize a paradigm of optimal sampling, which was successfully tested on real depth images.
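A minimal sketch of the idea behind the abstract (not the paper's implementation): in a discretized 1-D analogue, a Boltzmann-type distribution p(f | d) ∝ exp(-(‖Af - d‖²/σ² + λ‖Lf‖²)) over reconstructed curves is Gaussian, so the variance of any linear functional aᵀf (a height or a derivative at a point) is aᵀΣa, where Σ is the posterior covariance. The grid size, noise level, smoothness weight, and sampling pattern below are illustrative assumptions.

```python
import numpy as np

# Illustrative settings (assumptions, not from the paper).
n, sigma, lam = 50, 0.1, 10.0
x = np.linspace(0.0, 1.0, n)

# Sparse, noisy samples of an unknown curve (stand-in for depth data).
sample_idx = np.arange(0, n, 5)
A = np.eye(n)[sample_idx]                       # sampling operator
d = np.sin(2 * np.pi * x[sample_idx]) + sigma * np.random.randn(len(sample_idx))

# Second-difference operator L: a discrete membrane-style smoothness penalty.
L = np.diff(np.eye(n), 2, axis=0)

# Posterior precision and covariance of the reconstructed curve under the
# Gaussian (Boltzmann-type) distribution exp(-(||Af-d||^2/sigma^2 + lam*||Lf||^2)).
P = A.T @ A / sigma**2 + lam * L.T @ L
Sigma = np.linalg.inv(P)
f_map = Sigma @ (A.T @ d / sigma**2)            # regularized reconstruction

# Variance (uncertainty) of two linear functionals of the reconstruction:
i = n // 2
height_var = Sigma[i, i]                        # height at grid point i

a_deriv = np.zeros(n)                           # forward-difference derivative at i
a_deriv[i] = -1.0 / (x[1] - x[0])
a_deriv[i + 1] = 1.0 / (x[1] - x[0])
deriv_var = a_deriv @ Sigma @ a_deriv

print(f"height variance at x={x[i]:.2f}:     {height_var:.4g}")
print(f"derivative variance at x={x[i]:.2f}: {deriv_var:.4g}")
```

In this sketch the variance grows for points far from the samples, which is the kind of pointwise reliability measure the abstract describes.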

Original language: English
Pages (from-to): 982-995
Number of pages: 14
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 15
Issue number: 10
State: Published - Oct 1993

Keywords

  • Function spaces
  • Regularization
  • Surface reconstruction
  • Uncertainty
