Abstract
We consider the problem of estimating a vector μ = (μ₁, …, μₙ) under squared loss, based on independent observations Yᵢ ~ N(μᵢ, 1), i = 1, …, n, and possibly extra structural assumptions. We argue that many estimators are asymptotically equal to μ̂ᵢ = αμ̃ᵢ + (1 − α)Yᵢ + ζᵢ = μ̃ᵢ + (1 − α)(Yᵢ − μ̃ᵢ) + ζᵢ, where α ∈ [0, 1], μ̃ᵢ may depend on the data but is not a function of Yᵢ, and Σᵢ ζᵢ² = oₚ(n). We consider the optimal estimator of the form μ̃ᵢ + g(Yᵢ − μ̃ᵢ) for a general, possibly random, function g, and approximate it using nonparametric empirical Bayes ideas and techniques. We consider both the retrospective and the sequential estimation problems. We elaborate on and demonstrate our results in the case where the μ̃ᵢ are Kalman filter estimators. Simulations and a real data analysis are also provided.
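The shrinkage form described in the abstract can be illustrated with a minimal simulation. The sketch below is a hypothetical setup, not the paper's method: it draws true means μᵢ, observes Yᵢ ~ N(μᵢ, 1), builds a crude pilot estimate μ̃ᵢ that does not use Yᵢ itself (a leave-one-out grand mean), and applies an estimator of the stated form μ̃ᵢ + (1 − α)(Yᵢ − μ̃ᵢ) with an illustrative fixed α; the distribution of μ and the choice of α are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
mu = rng.normal(0.0, 2.0, size=n)   # true means (hypothetical prior, not from the paper)
y = mu + rng.normal(size=n)         # observations Y_i ~ N(mu_i, 1)

# Pilot estimate mu_tilde_i that depends on the data but not on Y_i itself:
# the leave-one-out grand mean of the other observations.
total = y.sum()
mu_tilde = (total - y) / (n - 1)

# Estimator of the stated form: mu_tilde + (1 - alpha) * (Y_i - mu_tilde).
alpha = 0.2                          # illustrative choice in [0, 1]
mu_hat = mu_tilde + (1 - alpha) * (y - mu_tilde)

# Average squared loss of the shrinkage estimator vs. the raw observations.
loss_shrink = np.mean((mu_hat - mu) ** 2)
loss_raw = np.mean((y - mu) ** 2)
```

Under this setup the shrinkage estimator improves on the raw Yᵢ because pulling toward a common center trades a small bias for a large variance reduction; the paper's contribution, by contrast, is to estimate the optimal (possibly nonlinear) correction g nonparametrically rather than fixing a linear one.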
| Original language | English |
|---|---|
| Pages (from-to) | 3459-3478 |
| Number of pages | 20 |
| Journal | Bernoulli |
| Volume | 25 |
| Issue number | 4B |
| DOIs | |
| State | Published - 2019 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2019 ISI/BS.
Keywords
- Empirical Bayes
- Exchangeable
- Kalman filter
- Shrinkage estimators
Title
Nonparametric empirical Bayes improvement of shrinkage estimators with applications to time series