Abstract
Covariance estimation is a core part of adaptive target detection. Most existing work focuses on the Mean Squared Error (MSE) metric because it is mathematically convenient. However, MSE does not always capture the statistical information needed for detection. We advocate switching to the Kullback-Leibler (KL) divergence. To support this, we analyze the Normalized Signal to Noise Ratio (NSNR) associated with the worst-case target. We show that the KL metric has a structure similar to NSNR and bounds it. To further clarify our point, we derive a simple variant of a classic MSE-based estimator by incorporating KL into a leave-one-out cross-validation (LOOCV) framework. Numerical experiments with various estimators on both synthetic and real data also demonstrate that KL and NSNR behave similarly and differ from MSE. Simply changing the metric in the LOOCV estimator improves KL and NSNR performance while reducing MSE performance.
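The metric swap described in the abstract can be illustrated with a toy shrinkage estimator tuned by LOOCV. This is a minimal sketch under assumed choices, not the paper's actual estimator: it shrinks the sample covariance toward a scaled identity and selects the shrinkage weight by minimizing either a Gaussian predictive loss (consistent with the KL criterion) or a Frobenius squared-error loss (the MSE criterion) on held-out samples. All function names and the shrinkage family are illustrative assumptions.

```python
import numpy as np

def shrink(S, alpha, p):
    # Shrink the sample covariance toward a scaled identity (illustrative family)
    return (1 - alpha) * S + alpha * (np.trace(S) / p) * np.eye(p)

def gauss_loss(x, C):
    # Gaussian predictive loss log|C| + x' C^{-1} x; minimizing it over held-out
    # samples is the KL-flavored criterion (up to constants)
    _, logdet = np.linalg.slogdet(C)
    return logdet + x @ np.linalg.solve(C, x)

def mse_loss(x, C):
    # Squared Frobenius distance between the rank-one sample term and the estimate
    return np.linalg.norm(np.outer(x, x) - C, "fro") ** 2

def loocv_alpha(X, alphas, loss):
    # Pick the shrinkage weight minimizing the average leave-one-out loss
    n, p = X.shape
    scores = []
    for a in alphas:
        total = 0.0
        for i in range(n):
            Xo = np.delete(X, i, axis=0)          # hold out sample i
            S = Xo.T @ Xo / (n - 1)               # zero-mean sample covariance
            total += loss(X[i], shrink(S, a, p))  # score on the held-out sample
        scores.append(total / n)
    return float(alphas[int(np.argmin(scores))])

# Synthetic zero-mean Gaussian data (toy dimensions, for illustration only)
rng = np.random.default_rng(0)
p, n = 5, 20
A = rng.standard_normal((p, p))
Sigma = A @ A.T + p * np.eye(p)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

alphas = np.linspace(0.01, 0.99, 25)
a_kl = loocv_alpha(X, alphas, gauss_loss)   # KL-style selection
a_mse = loocv_alpha(X, alphas, mse_loss)    # MSE-style selection
C_kl = shrink(X.T @ X / n, a_kl, p)
```

The two criteria generally select different shrinkage weights on the same data, which is the qualitative effect the abstract reports: changing only the LOOCV loss changes which estimator is chosen.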
| Original language | English |
|---|---|
| Title of host publication | 2025 IEEE Statistical Signal Processing Workshop, SSP 2025 |
| Publisher | IEEE Computer Society |
| Pages | 101-105 |
| Number of pages | 5 |
| ISBN (Electronic) | 9798331518004 |
| DOIs | |
| State | Published - 2025 |
| Event | 2025 IEEE Statistical Signal Processing Workshop, SSP 2025 - Edinburgh, United Kingdom Duration: 8 Jun 2025 → 11 Jun 2025 |
Publication series
| Name | IEEE Workshop on Statistical Signal Processing Proceedings |
|---|---|
| ISSN (Print) | 2373-0803 |
| ISSN (Electronic) | 2693-3551 |
Conference
| Conference | 2025 IEEE Statistical Signal Processing Workshop, SSP 2025 |
|---|---|
| Country/Territory | United Kingdom |
| City | Edinburgh |
| Period | 8/06/25 → 11/06/25 |
Bibliographical note
Publisher Copyright: © 2025 IEEE.