Abstract
The classical binary hypothesis testing problem is revisited. We notice that when one of the hypotheses is composite, there is an inherent difficulty in defining an optimality criterion that is both informative and well-justified. For testing in the simple normal location problem (that is, testing for the mean of multivariate Gaussians), we overcome the difficulty as follows. In this problem there exists a natural 'hardness' order between parameters: for any two parameters, the error-probability curves (attained when the parameter is known) are either identical, or one dominates the other. We can thus define minimax performance as the worst case over all parameters below a given hardness level. Fortunately, there exists a universal minimax test, in the sense that it is minimax for all hardness levels simultaneously. Under this criterion we also find the optimal test for composite hypothesis testing with training data. This criterion extends to the wide class of locally asymptotically normal models, in an asymptotic sense where the approximation of the error probabilities is additive. Since we obtain the asymptotically optimal tests for composite hypothesis testing both with and without training data, we quantify the loss due to universality and the gain from training data for these models.
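The abstract does not spell out the form of the universal minimax test; as a minimal illustrative sketch only, the snippet below sets up the normal location problem (H0: mean zero vs. the composite alternative that the mean has norm at least r) and evaluates a plausible norm-thresholding (GLRT-type) statistic. The dimension, false-alarm level, and choice of statistic are assumptions for illustration and are not taken from the paper.

```python
# Illustrative sketch (not the paper's construction): normal location problem,
# H0: theta = 0 vs. composite H1: ||theta|| >= r, observation y ~ N(theta, I_d).
# Candidate test: reject H0 when ||y|| exceeds a threshold calibrated under H0.
import numpy as np

rng = np.random.default_rng(0)
d = 5          # dimension of the observation (assumed for illustration)
alpha = 0.05   # target false-alarm probability under H0

# Calibrate the threshold on the norm statistic under H0: y ~ N(0, I_d).
null_norms = np.linalg.norm(rng.standard_normal((100_000, d)), axis=1)
threshold = np.quantile(null_norms, 1 - alpha)

def norm_test(y: np.ndarray) -> bool:
    """Reject H0 when the observation's Euclidean norm exceeds the threshold."""
    return np.linalg.norm(y) > threshold

# By rotational symmetry the miss probability depends on theta only through
# ||theta||, which plays the role of the 'hardness level' in the abstract.
r = 3.0
theta = np.zeros(d); theta[0] = r
y_alt = rng.standard_normal((100_000, d)) + theta
miss_prob = np.mean(np.linalg.norm(y_alt, axis=1) <= threshold)
print(f"false-alarm ~ {alpha}, miss probability at ||theta|| = {r}: {miss_prob:.3f}")
```

Under this sketch, parameters with larger norm yield uniformly better error curves for the norm test, which mirrors the hardness ordering described in the abstract; whether this statistic coincides with the paper's universal minimax test cannot be confirmed from the abstract alone.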
Original language | English
---|---
Article number | 9395484
Pages (from-to) | 3824-3846
Number of pages | 23
Journal | IEEE Transactions on Information Theory
Volume | 67
Issue number | 6
DOIs |
State | Published - Jun 2021
Bibliographical note
Publisher Copyright: © 1963-2012 IEEE.
Keywords
- Hypothesis testing
- local asymptotic normality
- min-max universality
- normal location problem
- training data