Abstract
Non- and semiparametric statistics are conventionally viewed as a single field. In this paper we argue that the people working in these two areas constitute two cultures, arising from distinct historical roots and operating under different paradigms. Roughly speaking, modern nonparametric statistics arose from descriptive statistics and is concerned with the estimation of "ill-conditioned" objects. Modern semiparametric statistics is concerned with more classical inference about well-conditioned infinite-dimensional objects, or about finite-dimensional objects in the presence of infinite-dimensional nuisance parameters. We present a framework for thinking about the goals of non- and semiparametric statistics. We consider parameters that are robust with respect to certain loss functions as the ideal objects of statistical inference. We discuss regularization as a general principle for estimating non-robust parameters, with examples ranging from kernel density estimation to Vapnik's statistical learning theory, the method of sieves, and the m out of n bootstrap. We then discuss the plug-in principle, and we close with a radical proposal for inference for non-robust parameters.
Original language | English
---|---
Pages (from-to) | 209-228
Number of pages | 20
Journal | Journal of Statistical Planning and Inference
Volume | 91
Issue number | 2
DOIs | 
State | Published - 1 Dec 2000
Keywords
- Nonparametric statistics
- Semiparametric statistics
- Bootstrap
- Plug-in
- Robustness
- MSC: Primary 62G07; Secondary 62G09, 62G20, 62G30, 62G35