NONNEGATIVE MINIMUM BIASED ESTIMATION IN VARIANCE COMPONENT MODELS

  • Published : 1989.06.25

Abstract

In a general variance component model, nonnegative quadratic estimators of the components of variance are considered which are invariant with respect to mean value translation and have minimum bias (analogously to the estimation theory of mean value parameters). Here the minimum is taken over an appropriate cone of positive semidefinite matrices, after a reduction by invariance has been made. Among these estimators, which always exist, the one of minimum norm is characterized. This characterization is achieved by systems of necessary and sufficient conditions and by a cone-restricted pseudoinverse. In models where the decomposing covariance matrices span a commutative quadratic subspace, a representation of the considered estimator is derived that requires merely the solution of an ordinary convex quadratic optimization problem. As an example, we present the two-way nested classification random model. An unbiased estimator is derived for the mean squared error of any unbiased or biased estimator that is expressible as a linear combination of independent sums of squares. Further, it is shown that, for the classical balanced variance component models, this estimator is the best invariant unbiased estimator for the variance of the ANOVA estimator and for the mean squared error of the nonnegative minimum biased estimator. As an example, the balanced two-way nested classification model with random effects is considered.
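To make the role of the convex quadratic optimization step concrete, the following is a minimal illustrative sketch, not the paper's exact minimum-norm estimator. It uses the balanced two-way nested random model mentioned in the abstract and estimates the variance components by projecting the observed ANOVA mean squares onto the nonnegative orthant under the expected-mean-square map, which is a convex quadratic program solvable by nonnegative least squares. The function name `nonnegative_anova_fit` and this ANOVA-based formulation are assumptions made for illustration only.

```python
# Illustrative sketch (assumption, not the paper's estimator): balanced
# two-way nested random model y_{ijk} = mu + a_i + b_{ij} + e_{ijk},
# i = 1..a, j = 1..b, k = 1..n.  The expected ANOVA mean squares are
# linear in (sigma_a^2, sigma_b^2, sigma_e^2), so a nonnegativity-
# constrained fit is an ordinary convex quadratic optimization problem.
import numpy as np
from scipy.optimize import nnls

def nonnegative_anova_fit(y, a_levels, b_levels, n_rep):
    """y has shape (a_levels, b_levels, n_rep); returns nonnegative
    estimates of (sigma_a^2, sigma_b^2, sigma_e^2)."""
    a, b, n = a_levels, b_levels, n_rep
    y = np.asarray(y, dtype=float).reshape(a, b, n)

    # ANOVA mean squares for the balanced two-way nested layout.
    cell_means = y.mean(axis=2)                      # (a, b)
    a_means = cell_means.mean(axis=1)                # (a,)
    grand = a_means.mean()
    ms_a = b * n * np.sum((a_means - grand) ** 2) / (a - 1)
    ms_b = n * np.sum((cell_means - a_means[:, None]) ** 2) / (a * (b - 1))
    ms_e = np.sum((y - cell_means[:, :, None]) ** 2) / (a * b * (n - 1))

    # Expected mean squares:
    #   E[MS_A]    = n*b*sigma_a^2 + n*sigma_b^2 + sigma_e^2
    #   E[MS_B(A)] =                 n*sigma_b^2 + sigma_e^2
    #   E[MS_E]    =                                sigma_e^2
    C = np.array([[n * b, n, 1.0],
                  [0.0,   n, 1.0],
                  [0.0, 0.0, 1.0]])
    ms = np.array([ms_a, ms_b, ms_e])

    # Nonnegative least squares = convex QP with nonnegativity constraints.
    sigma2, _ = nnls(C, ms)
    return sigma2
```

Without the nonnegativity constraint, this fit reduces to the classical ANOVA estimator, which can return negative component estimates; the constrained solution is the kind of nonnegative quadratic estimator the abstract contrasts it with.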

Keywords