Abstract
In this paper, we consider the problem of estimating an unknown deterministic parameter vector in a linear regression model with random Gaussian uncertainty in the mixing matrix. We prove that the maximum-likelihood (ML) estimator is a (de)regularized least squares estimator and develop three alternative approaches for finding the regularization parameter that maximizes the likelihood. We analyze the performance using the Cramér-Rao bound (CRB) on the mean squared error and show that the degradation in performance due to the uncertainty is not as severe as might be expected. Next, we address the problem again, assuming that the variances of the noise and of the elements in the model matrix are unknown, and derive the associated CRB and ML estimator. We compare our methods to known results on linear regression in the errors-in-variables (EIV) model. We discuss the similarity between these two competing approaches and provide a thorough comparison that sheds light on their theoretical and practical differences.
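The structure described in the abstract can be illustrated with a small numerical sketch. The model, variable names, and the grid search below are assumptions made for illustration only and do not reproduce the paper's three approaches for selecting the regularization parameter: we assume y = (A + E)x + w with i.i.d. Gaussian E and w, in which case the ML estimate takes a (de)regularized least-squares form x(λ) = (AᵀA + λI)⁻¹Aᵀy, with λ (possibly negative, hence "deregularized") chosen to maximize the likelihood.

```python
import numpy as np

# Minimal sketch (not the paper's exact algorithm). Assumed model:
#   y = (A + E) x + w,  E_ij ~ N(0, sigma_e^2),  w_i ~ N(0, sigma_w^2).
# Conditioned on x, y ~ N(A x, (sigma_e^2 ||x||^2 + sigma_w^2) I), so the
# negative log-likelihood of any candidate x has a closed form.

def neg_log_likelihood(x, y, A, sigma_e2, sigma_w2):
    """Negative log-likelihood of y given x (up to an additive constant)."""
    n = y.size
    var = sigma_e2 * np.dot(x, x) + sigma_w2      # effective noise variance
    resid = y - A @ x
    return 0.5 * n * np.log(var) + 0.5 * np.dot(resid, resid) / var

def regularized_ls(A, y, lam):
    """(De)regularized least squares: x(lam) = (A^T A + lam I)^{-1} A^T y."""
    m = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(m), A.T @ y)

def ml_estimate(A, y, sigma_e2, sigma_w2, lam_grid):
    """Pick the regularization parameter on a grid (a stand-in for the
    paper's parameter-selection methods) by maximizing the likelihood."""
    best_x, best_nll = None, np.inf
    for lam in lam_grid:
        try:
            x = regularized_ls(A, y, lam)
        except np.linalg.LinAlgError:  # skip lam that makes A^T A + lam I singular
            continue
        nll = neg_log_likelihood(x, y, A, sigma_e2, sigma_w2)
        if nll < best_nll:
            best_x, best_nll = x, nll
    return best_x

# Example usage with synthetic data
rng = np.random.default_rng(0)
n, m = 50, 5
A = rng.standard_normal((n, m))
x_true = rng.standard_normal(m)
sigma_e2, sigma_w2 = 0.01, 0.1
E = np.sqrt(sigma_e2) * rng.standard_normal((n, m))
y = (A + E) @ x_true + np.sqrt(sigma_w2) * rng.standard_normal(n)
x_hat = ml_estimate(A, y, sigma_e2, sigma_w2, np.linspace(-0.5, 5.0, 200))
```

Note that the grid over λ deliberately extends to negative values, reflecting the abstract's point that the ML solution may be deregularized rather than regularized.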
Original language | English |
---|---|
Pages (from-to) | 2194-2205 |
Number of pages | 12 |
Journal | IEEE Transactions on Signal Processing |
Volume | 56 |
Issue number | 6 |
DOIs | |
State | Published - Jun 2008 |
Externally published | Yes |
Bibliographical note
Funding Information: Manuscript received March 8, 2007; revised November 1, 2007. The associate editor coordinating the review of this manuscript and approving it for publication was Dr. Antonia Papandreou-Suppappola. The work of A. Wiesel and Y. C. Eldar was supported by the European Union 6th framework program via the NEWCOM and NEWCOM++ networks of excellence and by the Israel Science Foundation. Some of the results in this paper were presented at the 2006 IEEE International Conference on Acoustics, Speech and Signal Processing.
Keywords
- Errors in variables (EIV)
- Linear models
- Maximum-likelihood (ML) estimation
- Random model matrix
- Total least squares