• Title/Summary/Keyword: Stein variance estimator

An Improvement of the James-Stein Estimator with Some Shrinkage Points using the Stein Variance Estimator

  • Lee, Ki Won; Baek, Hoh Yoo
    • Communications for Statistical Applications and Methods / v.20 no.4 / pp.329-337 / 2013
  • Consider a p-variate ($p{\geq}3$) normal distribution with mean ${\theta}$ and covariance matrix ${\Sigma}={\sigma}^2{\mathbf{I}}_p$ for an unknown scalar ${\sigma}^2$. In this paper we improve the James-Stein estimator of ${\theta}$ in the case of shrinking toward some vectors, using the Stein variance estimator. It is also shown that this domination does not hold for the positive-part versions of these estimators.
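The classical James-Stein estimator with an unknown ${\sigma}^2$, which the abstract above improves upon, can be sketched as follows. This is a minimal illustration of the standard construction (shrinking toward a fixed point $b$ with an independent chi-square variance statistic), not the paper's actual estimator; all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: dimension p >= 3, n degrees of freedom for the variance statistic.
p, n = 10, 20
theta = np.full(p, 2.0)                  # true mean (unknown in practice)
sigma2 = 4.0                             # true variance (unknown in practice)

x = rng.normal(theta, np.sqrt(sigma2))   # X ~ N_p(theta, sigma^2 I_p)
s = sigma2 * rng.chisquare(n)            # S with S / sigma^2 ~ chi^2_n, independent of X

b = np.zeros(p)                          # shrinkage point
c = (p - 2) / (n + 2)                    # classical constant for the unknown-variance case
resid = x - b
shrink = 1.0 - c * s / (resid @ resid)
js = b + shrink * resid                  # James-Stein estimate of theta
js_plus = b + max(shrink, 0.0) * resid   # positive-part version
```

The positive-part version clips the shrinkage factor at zero so the estimate never overshoots past the shrinkage point; the abstract's result is that the variance-estimator improvement does not carry over to these positive-part rules.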

An approach to improving the James-Stein estimator shrinking towards projection vectors

  • Park, Tae Ryong; Baek, Hoh Yoo
    • Journal of the Korean Data and Information Science Society / v.25 no.6 / pp.1549-1555 / 2014
  • Consider a p-variate normal distribution with $p-q{\geq}3$, where q = rank($P_V$) for a projection matrix $P_V$. Using a simple property of the noncentral chi-square distribution, generalized Bayes estimators dominating the James-Stein estimator shrinking towards projection vectors under quadratic loss are given, based on the methods of Brown, Brewster and Zidek for estimating a normal variance. This result can be extended to the cases where the covariance matrix is completely unknown or ${\Sigma}={\sigma}^2I$ for an unknown scalar ${\sigma}^2$.

An improvement of estimators for the multinormal mean vector with the known norm

  • Kim, Jaehyun; Baek, Hoh Yoo
    • Journal of the Korean Data and Information Science Society / v.28 no.2 / pp.435-442 / 2017
  • Consider the problem of estimating a $p{\times}1$ mean vector ${\theta}$ (p ${\geq}$ 3) under quadratic loss from a multivariate normal population. We find a James-Stein type estimator which shrinks towards the projection vectors when the underlying distribution is a variance mixture of normals. In this case, the norm ${\parallel}{\theta}-K{\theta}{\parallel}$ is known, where K is a projection matrix with rank(K) = q. This class of estimators is general enough to include the class proposed by Marchand and Giri (1993). We derive the class and obtain the optimal type estimator. This research can also be applied to simple and multiple regression models in the case rank(K) ${\geq}2$.

Improvement of the Modified James-Stein Estimator with Shrinkage Point and Constraints on the Norm

  • Kim, Jae Hyun; Baek, Hoh Yoo
    • Journal of Integrative Natural Science / v.6 no.4 / pp.251-255 / 2013
  • For the mean vector of a p-variate normal distribution ($p{\geq}4$), the optimal estimator within the class of modified James-Stein type decision rules under quadratic loss is given when the underlying distribution is a variance mixture of normals and the norm ${\parallel}{\theta}-\bar{\theta}1{\parallel}$ is known.

Estimators Shrinking towards Projection Vector for Multivariate Normal Mean Vector under the Norm with a Known Interval

  • Baek, Hoh Yoo
    • Journal of Integrative Natural Science / v.11 no.3 / pp.154-160 / 2018
  • Consider the problem of estimating a $p{\times}1$ mean vector ${\theta}$ ($p-r{\geq}3$), r = rank(K) with a projection matrix K, under quadratic loss, based on a sample $Y_1$, $Y_2$, ${\cdots}$, $Y_n$. In this paper a James-Stein type estimator with shrinkage form is given when its variance distribution is specified and when the norm ${\parallel}{\theta}-K{\theta}{\parallel}$ is constrained, where K is an idempotent and symmetric matrix with rank(K) = r. A minimal complete class of James-Stein type estimators is characterized in this case, and the subclass of James-Stein type estimators that dominate the sample mean is derived.
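Several of the abstracts above involve shrinking toward a projection $K{\theta}$ with an idempotent, symmetric K. A minimal sketch of the standard projection-shrinkage form is given below; the choice of K (projection onto span{1}, i.e. shrinking toward the common-mean vector) and the known-variance assumption ${\sigma}^2=1$ are illustrative, not the papers' setting.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative projection: K projects onto span{1}, so rank(K) = r = 1 and p - r >= 3.
p, r = 12, 1
K = np.full((p, p), 1.0 / p)        # idempotent, symmetric projection matrix
theta = rng.normal(size=p)
x = rng.normal(theta, 1.0)          # one observation with sigma^2 = 1 assumed known

resid = x - K @ x                   # component of X orthogonal to its projection
shrink = 1.0 - (p - r - 2) / (resid @ resid)
js_proj = K @ x + shrink * resid    # shrinks X toward the projection K X
```

Since K X already estimates the projected part of ${\theta}$ exactly under the model, only the orthogonal component of dimension $p-r$ is shrunk, which is why the condition $p-r{\geq}3$ replaces the usual $p{\geq}3$.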