Convex Nonparanormal Regression

Yonatan Woodbridge*, Gal Elidan, Ami Wiesel

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Quantifying uncertainty in predictions or, more generally, estimating the posterior conditional distribution, is a core challenge in machine learning and statistics. We introduce Convex Nonparanormal Regression (CNR), a conditional nonparanormal approach to this task. CNR fits a posterior via convex optimization over a rich dictionary of pre-defined nonlinear transformations of Gaussians. It can fit an arbitrary conditional distribution, including multimodal and non-symmetric posteriors. For the special but powerful case of a piecewise-linear dictionary, we provide a closed form for the posterior mean, which can be used for pointwise predictions. Finally, we demonstrate the advantages of CNR over classical competitors on synthetic and real-world data.
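The nonparanormal idea behind CNR can be illustrated with a small sketch: a non-Gaussian posterior is modeled as a monotone nonlinear transformation of a latent Gaussian variable. Below, the transformation is built from a dictionary of piecewise-linear "hinge" functions, in the spirit of the piecewise-linear dictionary mentioned in the abstract. All names here (`knots`, `weights`, `piecewise_linear`) are illustrative assumptions, not the paper's actual notation or fitting procedure; the convex optimization over dictionary weights is omitted.

```python
import numpy as np

# Hypothetical dictionary parameters (assumed names, not from the paper):
# b_k are pre-defined knot locations, a_k are nonnegative weights, so the
# resulting map z -> y is strictly increasing (hence invertible).
knots = np.array([-1.0, 0.0, 1.0])   # b_k: knot locations
weights = np.array([0.5, 2.0, 0.5])  # a_k: nonnegative hinge weights

def piecewise_linear(z, a=weights, b=knots):
    """Monotone piecewise-linear transform built from hinge functions.

    y = z + sum_k a_k * max(z - b_k, 0), applied elementwise to z.
    """
    z = np.asarray(z, dtype=float)
    return z + np.maximum(z[:, None] - b[None, :], 0.0) @ a

# Pushing latent Gaussian samples through the transform yields a
# non-Gaussian (here right-skewed) distribution, even though the
# latent variable itself is Gaussian.
rng = np.random.default_rng(0)
z = rng.normal(size=100_000)
y = piecewise_linear(z)
skew = float(np.mean((y - y.mean()) ** 3) / np.std(y) ** 3)
print(f"sample skewness of transformed variable: {skew:.2f}")
```

Because each weight a_k is nonnegative, the transform is monotone, which is what lets a model of this kind represent an arbitrary marginal shape while keeping the latent structure Gaussian.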

Original language: English
Article number: 9508166
Pages (from-to): 1680-1684
Number of pages: 5
Journal: IEEE Signal Processing Letters
Volume: 28
DOIs
State: Published - 2021

Bibliographical note

Publisher Copyright:
© 1994-2012 IEEE.

Keywords

  • Linear regression
  • convex optimization
  • nonparanormal distribution

