Abstract
The multivariate normal regression model, in which a vector y of responses is to be predicted by a vector x of explanatory variables, is considered. A hierarchical framework is used to express prior information on both x and y. An empirical Bayes estimator is developed which shrinks the maximum likelihood estimator of the matrix of regression coefficients across rows and columns to nontrivial subspaces which reflect both types of prior information. The estimator is shown to be minimax and is applied to a set of chemometrics data for which it reduces the cross-validated predicted mean squared error of the maximum likelihood estimator by 38%.
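To make the shrinkage idea concrete, the following is a minimal sketch only: it shrinks the least-squares (maximum likelihood) estimate of the coefficient matrix toward its projection onto two prior subspaces, one for the explanatory variables and one for the responses. The function name, the orthonormal bases `U` and `V`, and the fixed weight `alpha` are illustrative assumptions; the paper's estimator instead chooses the amount of shrinkage by an empirical Bayes rule and is shown to be minimax, which this sketch does not reproduce.

```python
import numpy as np

def subspace_shrinkage_estimate(X, Y, U, V, alpha=0.5):
    """Illustrative subspace-shrinkage estimate of B in the model Y = X B + E.

    X : (n, p) matrix of explanatory variables.
    Y : (n, q) matrix of responses.
    U : (p, r) orthonormal basis for a prior subspace reflecting
        information about the explanatory variables x.
    V : (q, s) orthonormal basis for a prior subspace reflecting
        information about the responses y.
    alpha : fixed shrinkage weight in [0, 1]; an empirical Bayes rule
        would estimate the degree of shrinkage from the data instead.
    """
    # Maximum likelihood (least squares) estimate of the p x q coefficient matrix
    B_mle, *_ = np.linalg.lstsq(X, Y, rcond=None)

    # Orthogonal projections onto the prior subspaces
    P_x = U @ U.T   # (p, p): projects each column of B onto span(U)
    P_y = V @ V.T   # (q, q): projects each row of B onto span(V)

    # Shrink the MLE toward its projection across rows and columns
    B_target = P_x @ B_mle @ P_y
    return (1.0 - alpha) * B_mle + alpha * B_target


# Small synthetic usage example (all quantities hypothetical)
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p, q = 50, 6, 3
    X = rng.standard_normal((n, p))
    B_true = rng.standard_normal((p, q))
    Y = X @ B_true + 0.1 * rng.standard_normal((n, q))

    # Assumed prior subspaces: first 3 coordinates of x, first 2 of y
    U = np.eye(p)[:, :3]
    V = np.eye(q)[:, :2]
    B_shrunk = subspace_shrinkage_estimate(X, Y, U, V, alpha=0.3)
    print(B_shrunk.shape)  # (6, 3)
```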
| Original language | English |
| --- | --- |
| Pages (from-to) | 285-301 |
| Number of pages | 17 |
| Journal | Journal of Multivariate Analysis |
| Volume | 80 |
| Issue number | 2 |
| DOIs | |
| State | Published - 2002 |
Keywords
- James-Stein estimate
- Mean squared error
- Prior information
- Subspace shrinkage