A Bayesian comparison of some estimators used in linear regression with multicollinear data

Samuel D. Oman*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

Several estimators (ridge, principal components, generalized inverse and Stein) have been proposed as alternatives to least squares for the multiple linear regression model when the independent variables are multicollinear. These methods differ in the way they adjust the least squares estimate toward where the regression vector β "ought to be". From a Bayesian point of view, they assume different prior distributions for β. In this paper each estimator is expressed in such a way that the assumptions about the model which are implicit in a given prior distribution of β become apparent. The Stein estimate can be viewed as assuming the independent variables to be "inherently multicollinear", while the ridge estimate assumes they are not. Principal components and generalized inverse estimators correspond to a somewhat peculiar set of prior assumptions. A modification of the Stein estimate which is "smoother" than the principal components estimator is proposed.
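The contrast the abstract draws between these estimators is easiest to see in the canonical (principal-component) coordinates of X′X: ridge shrinks each component smoothly by λᵢ/(λᵢ + k), principal components truncates components with small eigenvalues outright (factor 0 or 1), and Stein applies one common shrinkage factor to all components. The sketch below illustrates this with NumPy; the ridge constant k, the eigenvalue threshold, and the Stein factor are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 4

# Nearly collinear design: last column is almost the sum of the others
X = rng.normal(size=(n, p))
X[:, -1] = X[:, :-1].sum(axis=1) + 0.01 * rng.normal(size=n)
y = X @ np.ones(p) + rng.normal(size=n)

# Canonical coordinates via the SVD of X: eigenvalues of X'X are s**2
U, s, Vt = np.linalg.svd(X, full_matrices=False)
lam = s**2
alpha_ols = (U.T @ y) / s          # OLS coefficients along each component
beta_ols = Vt.T @ alpha_ols        # ordinary least squares estimate

# Per-component shrinkage factors for each estimator (constants assumed)
k = 1.0                                   # illustrative ridge constant
ridge_factor = lam / (lam + k)            # smooth: shrinks small-lam most
pc_factor = (lam > 1.0).astype(float)     # hard 0/1 truncation at a threshold
stein_factor = np.full(p, 0.9)            # one common factor for all components

beta_ridge = Vt.T @ (ridge_factor * alpha_ols)
beta_pc    = Vt.T @ (pc_factor * alpha_ols)
beta_stein = Vt.T @ (stein_factor * alpha_ols)
```

The smallest eigenvalue (the near-collinear direction) gets the heaviest ridge shrinkage and is the component principal components regression drops, while Stein shrinks every direction equally, which matches the abstract's reading of Stein as treating the variables as "inherently multicollinear".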

Original language: English
Pages (from-to): 517-534
Number of pages: 18
Journal: Communications in Statistics - Theory and Methods
Volume: 7
Issue number: 6
DOIs
State: Published - 1 Jan 1978
Externally published: Yes

Keywords

  • biased estimation
  • principal components regression
  • ridge regression
  • shrinkage estimators
  • Stein estimate

