On Universality and Training in Binary Hypothesis Testing

Michael Bell*, Yuval Kochman

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

The classical binary hypothesis testing problem is revisited. We notice that when one of the hypotheses is composite, there is an inherent difficulty in defining an optimality criterion that is both informative and well-justified. For testing in the simple normal location problem (that is, testing for the mean of multivariate Gaussians), we overcome the difficulty as follows. In this problem there exists a natural 'hardness' order between parameters, since for different parameters the error-probability curves (when the parameter is known) are either identical, or one dominates the other. We can thus define minimax performance as the worst case among parameters that are below some hardness level. Fortunately, there exists a universal minimax test, in the sense that it is minimax for all hardness levels simultaneously. Under this criterion we also find the optimal test for composite hypothesis testing with training data. This criterion extends to the wide class of local asymptotic normal models, in an asymptotic sense where the approximation of the error probabilities is additive. Since we have the asymptotically optimal tests for composite hypothesis testing with and without training data, we quantify the loss of universality and the gain from training data for these models.
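As a minimal illustrative sketch of the normal location setting discussed above (not the paper's universal minimax test, whose construction is given in the article itself): for an observation x ~ N(mu, I_d), a standard way to test the simple null H0: mu = 0 against the composite alternative H1: mu ≠ 0 is the generalized likelihood-ratio test, whose statistic reduces to a function of ||x||. The function names and threshold below are illustrative assumptions.

```python
import numpy as np

def glrt_statistic(x):
    """Generalized log-likelihood-ratio statistic for H0: mu = 0 vs the
    composite H1: mu != 0, given a single observation x ~ N(mu, I_d).
    Maximizing the H1 likelihood over mu yields mu_hat = x, so the
    log-likelihood ratio simplifies to ||x||^2 / 2."""
    x = np.asarray(x, dtype=float)
    return 0.5 * float(x @ x)

def glrt_test(x, threshold):
    """Reject H0 when the statistic exceeds the (illustrative) threshold."""
    return glrt_statistic(x) > threshold
```

Note that the statistic depends on x only through its norm, which is consistent with the 'hardness' order described in the abstract: parameters at the same distance from the null induce identical error-probability curves.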

Original language: American English
Article number: 9395484
Pages (from-to): 3824-3846
Number of pages: 23
Journal: IEEE Transactions on Information Theory
Volume: 67
Issue number: 6
DOIs
State: Published - Jun 2021

Bibliographical note

Publisher Copyright:
© 1963-2012 IEEE.

Keywords

  • Hypothesis testing
  • local asymptotic normality
  • min-max universality
  • normal location problem
  • training data
