Generalized ridge regression, least squares with stochastic prior information, and Bayesian estimators

Authors:

Highlights:

Abstract

The ridge estimator of the usual linear model is generalized by the introduction of an a priori vector r and an associated positive semidefinite matrix S. It is then shown that the generalized ridge estimator can be justified in two ways: (a) by minimizing the residual sum of squares subject to a constraint on the length, in the metric S, of the vector of differences between r and the estimated linear model coefficients; (b) by incorporating prior knowledge, with r playing the role of the vector of prior means and S proportional to the prior precision matrix. Both a Bayesian and an Aitken generalized least squares framework are used for the latter. The properties of the new estimator are derived and compared with those of the ordinary least squares estimator. The new method is illustrated under different assumptions on the form of the S matrix.
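The following is a minimal NumPy sketch of the kind of estimator the abstract describes, assuming the penalized criterion ||y − Xb||² + k·(b − r)ᵀS(b − r), whose minimizer is b̂ = (XᵀX + kS)⁻¹(Xᵀy + kSr); the function name, the choice of k, and the test data are illustrative and not taken from the paper.

```python
import numpy as np

def generalized_ridge(X, y, r, S, k):
    """Generalized ridge estimator: minimizes
    ||y - X b||^2 + k * (b - r)' S (b - r),
    shrinking the coefficients toward the prior vector r
    in the metric defined by the positive semidefinite matrix S."""
    XtX = X.T @ X
    rhs = X.T @ y + k * (S @ r)
    return np.linalg.solve(XtX + k * S, rhs)

# Illustrative usage: with r = 0 and S = I this reduces to ordinary ridge
# regression; with k = 0 it reduces to ordinary least squares.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.1 * rng.standard_normal(50)

r = np.zeros(3)   # prior mean vector
S = np.eye(3)     # prior precision matrix (up to proportionality)
print(generalized_ridge(X, y, r, S, k=1.0))
```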

Keywords:

Review history: Available online 21 March 2002.

Paper URL: https://doi.org/10.1016/0096-3003(80)90002-8