Probabilistic Simplex Component Analysis by Importance Sampling

Nerya Granot, Tzvi Diskin*, Nicolas Dobigeon, Ami Wiesel

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

In this letter, we consider the problem of linearly unmixing hidden random variables defined over the simplex in additive Gaussian noise, a problem also known as probabilistic simplex component analysis (PRISM). Previous solutions to this challenging problem were based on geometrical approaches or computationally intensive variational methods. In contrast, we propose a conventional expectation-maximization (EM) algorithm that embeds importance sampling. For this purpose, the proposal distribution is chosen as a simple surrogate of the target posterior that is guaranteed to lie in the simplex: its Dirichlet parameters are fitted to the linear minimum mean squared error (LMMSE) approximation, which is accurate at high signal-to-noise ratio. Numerical experiments in different settings demonstrate the advantages of this adaptive surrogate over state-of-the-art methods.
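
To make the approach concrete, below is a minimal sketch, not the authors' implementation, of one EM iteration with an importance-sampling E-step for the model y = A s + n, assuming a uniform Dirichlet prior on the simplex and isotropic Gaussian noise. The function names (lmmse_posterior, dirichlet_from_moments, e_step_is, em_step) and the simple variance-averaging rule used to set the Dirichlet concentration are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.special import gammaln

def lmmse_posterior(y, A, sigma2, prior_mean, prior_cov):
    """Gaussian (LMMSE) approximation of the posterior mean/covariance of s given y."""
    S = A @ prior_cov @ A.T + sigma2 * np.eye(A.shape[0])
    K = prior_cov @ A.T @ np.linalg.inv(S)
    mean = prior_mean + K @ (y - A @ prior_mean)
    cov = prior_cov - K @ A @ prior_cov
    return mean, cov

def dirichlet_from_moments(mean, cov):
    """Moment-match a Dirichlet(alpha) proposal to a given simplex mean/covariance (heuristic)."""
    m = np.clip(mean, 1e-6, None)
    m /= m.sum()
    var = np.clip(np.diag(cov), 1e-8, None)
    a0 = max(np.mean(m * (1.0 - m) / var) - 1.0, 1.0)   # concentration from the variances
    return a0 * m

def e_step_is(y, A, sigma2, n_samples=1000, rng=None):
    """Self-normalized importance sampling of E[s|y] and E[s s^T|y] with a Dirichlet proposal."""
    rng = np.random.default_rng() if rng is None else rng
    r = A.shape[1]
    # Uniform Dirichlet prior on the simplex: mean 1/r, covariance (I - J/r) / (r (r + 1)).
    prior_mean = np.full(r, 1.0 / r)
    prior_cov = (np.eye(r) - np.full((r, r), 1.0 / r)) / (r * (r + 1))
    m, C = lmmse_posterior(y, A, sigma2, prior_mean, prior_cov)
    alpha = dirichlet_from_moments(m, C)
    s = np.clip(rng.dirichlet(alpha, size=n_samples), 1e-12, None)   # samples stay in the simplex
    # Importance weights: Gaussian likelihood over Dirichlet proposal (the uniform prior cancels).
    log_lik = -0.5 * np.sum((y[None, :] - s @ A.T) ** 2, axis=1) / sigma2
    log_q = gammaln(alpha.sum()) - gammaln(alpha).sum() + ((alpha - 1.0) * np.log(s)).sum(axis=1)
    w = np.exp(log_lik - log_q - (log_lik - log_q).max())
    w /= w.sum()
    return w @ s, (s * w[:, None]).T @ s

def em_step(Y, A, sigma2, rng=None):
    """One EM iteration over the mixing matrix A and noise variance sigma2 (Y has shape T x d)."""
    T, d = Y.shape
    r = A.shape[1]
    Es, EssT = np.zeros((T, r)), np.zeros((r, r))
    for t, y in enumerate(Y):
        Es[t], second = e_step_is(y, A, sigma2, rng=rng)
        EssT += second
    A_new = (Y.T @ Es) @ np.linalg.inv(EssT)                 # A = (sum_t y_t E[s_t]^T)(sum_t E[s_t s_t^T])^{-1}
    resid = np.sum(Y ** 2) - 2.0 * np.sum((Y @ A_new) * Es) + np.trace(A_new.T @ A_new @ EssT)
    sigma2_new = max(resid / (T * d), 1e-12)
    return A_new, sigma2_new
```

In this sketch the proposal is refit per observation from the LMMSE mean and covariance, so all samples lie in the simplex by construction and the self-normalized weights involve only the Gaussian likelihood and the Dirichlet proposal density.
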

Original language: English
Pages (from-to): 683-687
Number of pages: 5
Journal: IEEE Signal Processing Letters
Volume: 30
DOIs
State: Published - 2023

Bibliographical note

Publisher Copyright:
© 1994-2012 IEEE.

Keywords

  • Expectation maximization
  • importance sampling
  • simplex-structured matrix factorization
