Abstract
We show that, under a sparsity scenario, the Lasso estimator and the Dantzig selector exhibit similar behavior. For both methods, we derive, in parallel, oracle inequalities for the prediction risk in the general nonparametric regression model, as well as bounds on the ℓ_p estimation loss for 1 ≤ p ≤ 2 in the linear model when the number of variables can be much larger than the sample size.
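As background for readers who do not have the two estimators in mind, a standard formulation in the high-dimensional linear model y = Xβ + ε is sketched below; the notation, the 1/n normalization, and the tuning parameter λ are generic conventions and not necessarily those used in the paper itself.

```latex
% Lasso: least squares penalized by the \ell_1 norm of the coefficients
\hat\beta_{\mathrm{Lasso}}
  = \arg\min_{\beta \in \mathbb{R}^p}
    \Big\{ \tfrac{1}{n}\,\|y - X\beta\|_2^2 + 2\lambda\,\|\beta\|_1 \Big\},

% Dantzig selector: smallest \ell_1 norm subject to a bound on the
% correlation between the residuals and the columns of X
\hat\beta_{\mathrm{Dantzig}}
  = \arg\min_{\beta \in \mathbb{R}^p}
    \Big\{ \|\beta\|_1 \;:\;
      \big\| \tfrac{1}{n}\, X^{\top}(y - X\beta) \big\|_{\infty} \le \lambda \Big\}.
```

Both estimators rely on the same ℓ_1 mechanism, which is what makes the parallel analysis described in the abstract possible; the bounds referred to above concern the prediction risk and the estimation loss ‖β̂ − β‖_p for 1 ≤ p ≤ 2.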
| Original language | English |
| --- | --- |
| Pages (from-to) | 1705-1732 |
| Number of pages | 28 |
| Journal | Annals of Statistics |
| Volume | 37 |
| Issue number | 4 |
| DOIs | |
| State | Published - Aug 2009 |
Keywords
- Linear models
- Model selection
- Nonparametric statistics