Comparing KL Divergence and MSE for Covariance Estimation in Target Detection

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Covariance estimation is a core component of adaptive target detection. Most works focus on the Mean Squared Error (MSE) metric because it is analytically convenient. However, MSE does not always capture the statistical information needed for detection. We advocate switching to the Kullback-Leibler (KL) divergence. To support this, we analyze the Normalized Signal to Noise Ratio (NSNR) associated with the worst-case target and show that the KL metric has a structure similar to NSNR and bounds it. To further illustrate the point, we derive a simple variant of a classic MSE-based estimator by incorporating KL into a leave-one-out cross-validation (LOOCV) framework. Numerical experiments with various estimators, on both synthetic and real data, demonstrate that KL and NSNR behave similarly and differ from MSE: simply changing the metric in the LOOCV estimator improves KL and NSNR performance while degrading MSE performance.
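To make the comparison concrete, below is a minimal NumPy sketch of the two metrics (the Gaussian KL divergence and the Frobenius MSE) together with one way a KL-flavored criterion can drive leave-one-out selection of a shrinkage coefficient. This is an illustrative assumption on my part, not the estimator derived in the paper: it assumes real-valued zero-mean data and a scaled-identity shrinkage target, and the name `loocv_kl_shrinkage` and all parameter choices are hypothetical.

```python
import numpy as np

def gaussian_kl(R_true, R_est):
    """KL( N(0, R_true) || N(0, R_est) ) for zero-mean Gaussians."""
    p = R_true.shape[0]
    _, logdet_est = np.linalg.slogdet(R_est)
    _, logdet_true = np.linalg.slogdet(R_true)
    return 0.5 * (np.trace(np.linalg.solve(R_est, R_true)) - p
                  + logdet_est - logdet_true)

def mse(R_true, R_est):
    """Squared Frobenius distance between covariance matrices."""
    return np.linalg.norm(R_est - R_true, "fro") ** 2

def loocv_kl_shrinkage(X, alphas=np.linspace(0.05, 0.95, 19)):
    """Pick a shrinkage coefficient by the leave-one-out Gaussian
    negative log-likelihood (a KL-flavored criterion).

    X has shape (n, p), rows are zero-mean samples. The scaled-identity
    target and this whole routine are illustrative, not the paper's
    estimator.
    """
    n, p = X.shape
    S_full = X.T @ X / n
    T = (np.trace(S_full) / p) * np.eye(p)   # shrinkage target
    best_alpha, best_score = alphas[0], np.inf
    for a in alphas:
        score = 0.0
        for i in range(n):
            Xi = np.delete(X, i, axis=0)          # hold out sample i
            S = Xi.T @ Xi / (n - 1)
            R = (1 - a) * S + a * T
            _, logdet = np.linalg.slogdet(R)
            # Gaussian NLL of the held-out sample (up to constants)
            score += logdet + X[i] @ np.linalg.solve(R, X[i])
        if score < best_score:
            best_alpha, best_score = a, score
    return (1 - best_alpha) * S_full + best_alpha * T, best_alpha

# Toy comparison on synthetic data
rng = np.random.default_rng(0)
p, n = 10, 30
A = rng.standard_normal((p, p))
R_true = A @ A.T / p + np.eye(p)
X = rng.multivariate_normal(np.zeros(p), R_true, size=n)
R_kl, alpha = loocv_kl_shrinkage(X)
print(f"alpha={alpha:.2f}  KL={gaussian_kl(R_true, R_kl):.3f}  "
      f"MSE={mse(R_true, R_kl):.3f}")
```

Consistent with the abstract's finding, a coefficient selected under a KL criterion need not be the one that minimizes Frobenius MSE, so the two metrics can rank estimators differently.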

Original language: English
Title of host publication: 2025 IEEE Statistical Signal Processing Workshop, SSP 2025
Publisher: IEEE Computer Society
Pages: 101-105
Number of pages: 5
ISBN (Electronic): 9798331518004
DOIs
State: Published - 2025
Event: 2025 IEEE Statistical Signal Processing Workshop, SSP 2025 - Edinburgh, United Kingdom
Duration: 8 Jun 2025 – 11 Jun 2025

Publication series

Name: IEEE Workshop on Statistical Signal Processing Proceedings
ISSN (Print): 2373-0803
ISSN (Electronic): 2693-3551

Conference

Conference: 2025 IEEE Statistical Signal Processing Workshop, SSP 2025
Country/Territory: United Kingdom
City: Edinburgh
Period: 8/06/25 – 11/06/25

Bibliographical note

Publisher Copyright:
© 2025 IEEE.
