• Title/Summary/Keyword: Estimator


Improving Efficiency of the Moment Estimator of the Extreme Value Index

  • Yun, Seokhoon
    • Journal of the Korean Statistical Society, v.30 no.3, pp.419-433, 2001
  • In this paper we introduce a method of improving the efficiency of the moment estimator of Dekkers, Einmahl and de Haan (1989) for the extreme value index $\beta$. A new estimator of $\beta$ is proposed by adding the third moment to the original moment estimator, which is composed of the first two moments of the log-transformed sample data. We establish asymptotic normality of the new estimator and examine an adaptive procedure for it. The resulting adaptive estimator proves to be asymptotically better than the moment estimator, particularly for $\beta < 0$.
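
The abstract builds on the classical moment estimator, which combines the first two moments of the log-excesses over a high order statistic. For reference, here is a minimal sketch of that classical Dekkers-Einmahl-de Haan estimator only, not the improved third-moment version proposed in the paper; the Pareto test data and the choice k = 200 are illustrative assumptions.

```python
import numpy as np

def moment_estimator(x, k):
    """Classical moment estimator of Dekkers, Einmahl and de Haan (1989) for the
    extreme value index, based on the k largest observations (all assumed positive)."""
    x = np.sort(np.asarray(x, dtype=float))
    logs = np.log(x[-k:]) - np.log(x[-(k + 1)])    # log-excesses over X_(n-k)
    m1 = np.mean(logs)                             # first moment
    m2 = np.mean(logs ** 2)                        # second moment
    return m1 + 1.0 - 0.5 / (1.0 - m1 ** 2 / m2)

# Illustration: Pareto data with tail index 2 have extreme value index 0.5.
rng = np.random.default_rng(0)
sample = rng.pareto(2.0, size=5000) + 1.0
print(moment_estimator(sample, k=200))             # roughly 0.5
```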


An alternative method for estimating lognormal means

  • Kwon, Yeil
    • Communications for Statistical Applications and Methods, v.28 no.4, pp.351-368, 2021
  • For probabilistic models of positively skewed data, the lognormal distribution is one of the key distributions and plays a critical role; lognormal models appear in various areas, such as medical science, engineering, and finance. In this paper, we propose a new estimator for the lognormal mean and assess its performance in terms of the relative mean squared error (RMSE) compared with Shen's estimator (Shen et al., 2006), which is considered the best among the existing methods. The proposed estimator includes a tuning parameter; by choosing the optimal value of the tuning parameter, we can improve its average performance over the typical range of $\sigma^2$. The bias reduction of the proposed estimator tends to exceed the increase in variance, resulting in a smaller RMSE than Shen's estimator. A numerical study reveals that the proposed estimator performs comparably with Shen's estimator when $\sigma^2$ is small and exhibits a meaningful decrease in the RMSE for moderate and large values of $\sigma^2$.
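
Neither Shen's estimator nor the tuning-parameter estimator proposed in the paper is spelled out in the abstract, so the sketch below only illustrates the kind of Monte Carlo RMSE comparison described, using two textbook estimators of the lognormal mean $E[X]=\exp(\mu+\sigma^2/2)$: the raw sample mean and the plug-in estimator $\exp(\bar{y}+s^2/2)$. All simulation settings are illustrative assumptions.

```python
import numpy as np

def rmse_comparison(mu=0.0, sigma2=1.5, n=25, n_rep=20000, seed=1):
    """Monte Carlo RMSE of two textbook estimators of the lognormal mean
    exp(mu + sigma2/2): the sample mean and the plug-in exp(ybar + s^2/2)."""
    rng = np.random.default_rng(seed)
    true_mean = np.exp(mu + sigma2 / 2)
    y = rng.normal(mu, np.sqrt(sigma2), size=(n_rep, n))    # y = log(x)
    x = np.exp(y)
    est_naive = x.mean(axis=1)                              # sample mean of x
    est_plugin = np.exp(y.mean(axis=1) + y.var(axis=1, ddof=1) / 2)
    rmse = lambda est: np.sqrt(np.mean((est - true_mean) ** 2))
    return rmse(est_naive), rmse(est_plugin)

print(rmse_comparison())   # RMSE over 20000 replications for each estimator
```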

A Comparative Study for Several Bayesian Estimators Under Balanced Loss Function

  • Kim, Yeong-Hwa
    • Journal of the Korean Data and Information Science Society, v.17 no.2, pp.291-300, 2006
  • In this research, the performance of widely used Bayesian estimators, namely the Bayes estimator, the empirical Bayes estimator, the constrained Bayes estimator and the constrained empirical Bayes estimator, is compared by means of a measurement under the balanced loss function for the typical normal-normal situation. The proposed measurement is a weighted sum of the precisions of the first and second moments. As a result, one obtains a criterion according to the size of the prior variance relative to the population variance.
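
As a minimal sketch of the normal-normal setup the abstract refers to, the code below implements only the Bayes and a moment-based empirical Bayes estimator; the constrained variants and the balanced-loss measurement used in the paper are not reproduced, and the prior and variance values in the example are illustrative assumptions.

```python
import numpy as np

def bayes_estimate(x, sigma2, mu, tau2):
    """Bayes estimator of theta_i in the normal-normal model
    theta_i ~ N(mu, tau2), x_i | theta_i ~ N(theta_i, sigma2):
    the posterior mean shrinks each x_i toward the prior mean mu."""
    b = sigma2 / (sigma2 + tau2)                  # shrinkage factor
    return (1.0 - b) * x + b * mu

def empirical_bayes_estimate(x, sigma2):
    """Empirical Bayes version: the prior mean and variance are replaced by
    moment estimates from the marginal model x_i ~ N(mu, sigma2 + tau2)."""
    mu_hat = x.mean()
    tau2_hat = max(x.var(ddof=1) - sigma2, 0.0)   # truncated at zero
    return bayes_estimate(x, sigma2, mu_hat, tau2_hat)

rng = np.random.default_rng(0)
theta = rng.normal(2.0, 1.0, size=10)             # true means
x = rng.normal(theta, 1.0)                        # one observation per mean
print(empirical_bayes_estimate(x, sigma2=1.0))
```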


A Comparative Study for Several Bayesian Estimators Under Squared Error Loss Function

  • Kim, Yeong-Hwa
    • Journal of the Korean Data and Information Science Society, v.16 no.2, pp.371-382, 2005
  • The paper compares the performance of some widely used Bayesian estimators, namely the Bayes estimator, the empirical Bayes estimator, the constrained Bayes estimator and the constrained empirical Bayes estimator, by means of a new measurement under the squared error loss function for the typical normal-normal situation. The proposed measurement is a weighted sum of the precisions of the first and second moments. As a result, one obtains a criterion according to the size of the prior variance relative to the population variance.


Minimax Choice and Convex Combinations of Generalized Pickands Estimator of the Extreme Value Index

  • Yun, Seokhoon
    • Journal of the Korean Statistical Society, v.31 no.3, pp.315-328, 2002
  • As an extension of the well-known Pickands (1975) estimator for the extreme value index, Yun (2002) introduced a generalized Pickands estimator. This paper searches for a minimax estimator in the sense of minimizing the maximum asymptotic relative efficiency of the Pickands estimator with respect to the generalized one. To reduce the asymptotic variance of the resulting estimator, convex combinations of the minimax estimator are also considered and their asymptotic normality is established. Finally, the optimal combination is determined and proves to be superior to the generalized Pickands estimator.
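
The generalized and minimax versions studied by Yun (2002) are not specified in the abstract; as a point of reference, here is a minimal sketch of the classical Pickands (1975) estimator itself. The exponential test data (extreme value index 0) and the choice k = 200 are illustrative assumptions.

```python
import numpy as np

def pickands_estimator(x, k):
    """Classical Pickands (1975) estimator of the extreme value index, based on the
    upper order statistics X_(n-k+1), X_(n-2k+1) and X_(n-4k+1); needs 4*k <= n."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    q1, q2, q4 = x[n - k], x[n - 2 * k], x[n - 4 * k]
    return np.log((q1 - q2) / (q2 - q4)) / np.log(2.0)

# Illustration: exponential data have extreme value index 0.
rng = np.random.default_rng(0)
sample = rng.exponential(size=8000)
print(pickands_estimator(sample, k=200))    # roughly 0
```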

Shrinkage Estimator of Dispersion of an Inverse Gaussian Distribution

  • Lee, In-Suk;Park, Young-Soo
    • Journal of the Korean Data and Information Science Society, v.17 no.3, pp.805-809, 2006
  • In this paper a shrinkage estimator for the measure of dispersion of the inverse Gaussian distribution with known mean is proposed. We also compare the relative bias and relative efficiency of the proposed estimator with respect to the minimum variance unbiased estimator.
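
The paper's particular shrinkage estimator is not given in the abstract. Assuming the dispersion parameter is $1/\lambda$ for an $IG(\mu, \lambda)$ distribution with known mean $\mu$, the sketch below shows the minimum variance unbiased estimator and one standard MSE-minimizing shrinkage of it, both based on the fact that $\lambda \sum_i (X_i-\mu)^2/(\mu^2 X_i) \sim \chi^2_n$; treat it as an illustration rather than the proposed estimator.

```python
import numpy as np

def ig_dispersion_estimates(x, mu):
    """Estimators of the dispersion 1/lambda of an inverse Gaussian IG(mu, lambda)
    with known mean mu.  V = sum((x_i - mu)^2 / (mu^2 x_i)) satisfies
    lambda * V ~ chi-square(n), so V/n is the MVUE of 1/lambda and V/(n + 2)
    is the multiple of V with the smallest mean squared error."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    v = np.sum((x - mu) ** 2 / (mu ** 2 * x))
    return v / n, v / (n + 2)                     # (MVUE, shrunken estimate)

rng = np.random.default_rng(0)
x = rng.wald(mean=1.0, scale=4.0, size=30)        # IG(mu=1, lambda=4): dispersion 0.25
print(ig_dispersion_estimates(x, mu=1.0))
```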


A Robust Estimator in Multivariate Regression Using Least Quartile Difference

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods, v.12 no.1, pp.39-46, 2005
  • We propose an equivariant and robust estimator in the multivariate regression model based on the least quartile difference (LQD) estimator in univariate regression. We call this estimator the multivariate least quartile difference (MLQD) estimator. The MLQD estimator takes the correlations among the response variables into account, and it can be shown that it has the appropriate equivariance properties defined for multivariate regression. The MLQD estimator has a high breakdown point, as does the univariate LQD estimator. We develop an algorithm for computing the MLQD estimate. Simulations are performed to compare the efficiency of the MLQD estimate with those of the coordinatewise LQD estimate and the multivariate least trimmed squares estimate.
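
The multivariate MLQD algorithm itself is not described in the abstract; the sketch below only illustrates the underlying univariate LQD criterion of Croux, Rousseeuw and Hössjer (1994), namely minimizing roughly the first quartile of the pairwise absolute residual differences, together with a crude random search over exact-fit candidates. The search strategy and all settings are illustrative assumptions.

```python
import numpy as np

def lqd_objective(beta, x, y):
    """Univariate LQD criterion (Croux, Rousseeuw and Hossjer, 1994): the
    C(h,2)-th smallest pairwise absolute residual difference, with h = n//2 + 1.
    The criterion ignores any intercept, since it cancels in r_i - r_j."""
    r = y - x @ beta
    diffs = np.abs(r[:, None] - r[None, :])[np.triu_indices(len(r), k=1)]
    h = len(r) // 2 + 1
    q = h * (h - 1) // 2
    return np.partition(diffs, q - 1)[q - 1]

def lqd_random_search(x, y, n_trials=2000, seed=0):
    """Crude random search: evaluate exact fits through random p-subsets and
    keep the candidate slope vector with the smallest LQD objective."""
    rng = np.random.default_rng(seed)
    n, p = x.shape
    best, best_val = None, np.inf
    for _ in range(n_trials):
        idx = rng.choice(n, size=p, replace=False)
        try:
            cand = np.linalg.solve(x[idx], y[idx])   # exact fit through p points
        except np.linalg.LinAlgError:
            continue
        val = lqd_objective(cand, x, y)
        if val < best_val:
            best, best_val = cand, val
    return best
```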

An Equivariant and Robust Estimator in Multivariate Regression Based on Least Trimmed Squares

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods, v.10 no.3, pp.1037-1046, 2003
  • We propose an equivariant and robust estimator in the multivariate regression model based on the least trimmed squares (LTS) estimator in univariate regression. We call this estimator the multivariate least trimmed squares (MLTS) estimator. The MLTS estimator takes the correlations among the response variables into account, and it can be shown that it has the appropriate equivariance properties defined for multivariate regression. The MLTS estimator has a high breakdown point, as does the LTS estimator in the univariate case. We develop an algorithm for computing the MLTS estimate. Simulations are performed to compare the efficiency of the MLTS estimate with that of the coordinatewise LTS estimate, and a numerical example is given to illustrate the effectiveness of the MLTS estimate in multivariate regression.
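
Likewise, only the univariate LTS building block is sketched here: minimize the sum of the h smallest squared residuals using random elemental starts followed by concentration (C-) steps, in the spirit of the FAST-LTS algorithm. The paper's multivariate MLTS algorithm is not reproduced, and the coverage h and iteration counts are illustrative assumptions.

```python
import numpy as np

def lts_fit(x, y, h=None, n_starts=200, n_csteps=20, seed=0):
    """Univariate-response LTS sketch: minimize the sum of the h smallest squared
    residuals via random elemental starts followed by concentration (C-) steps."""
    rng = np.random.default_rng(seed)
    n, p = x.shape
    h = h or (n + p + 1) // 2                       # common default coverage
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        idx = rng.choice(n, size=p, replace=False)  # random elemental start
        try:
            beta = np.linalg.solve(x[idx], y[idx])
        except np.linalg.LinAlgError:
            continue
        for _ in range(n_csteps):                   # C-step: refit OLS on best h points
            keep = np.argsort((y - x @ beta) ** 2)[:h]
            beta, *_ = np.linalg.lstsq(x[keep], y[keep], rcond=None)
        obj = np.sort((y - x @ beta) ** 2)[:h].sum()
        if obj < best_obj:
            best_beta, best_obj = beta, obj
    return best_beta
```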

A note on nonparametric density deconvolution by weighted kernel estimators

  • Lee, Sungho
    • Journal of the Korean Data and Information Science Society, v.25 no.4, pp.951-959, 2014
  • Recently, Hazelton and Turlach (2009) proposed a weighted kernel density estimator for the deconvolution problem. In the case of a Gaussian kernel and Gaussian measurement error, they argued that the weighted kernel density estimator is competitive with the classical deconvolution kernel estimator. In this paper we consider weighted kernel density estimators when the sample observations are contaminated by double exponentially distributed errors. The performance of the weighted kernel density estimators is compared with that of the classical deconvolution kernel estimator and the kernel density estimator based on the support vector regression method by means of a simulation study. The weighted density estimator with the Gaussian kernel shows numerical instability in the practical implementation of its optimization function. The weighted density estimates with the double exponential kernel show patterns very similar to those of the classical kernel density estimates in the simulations, but the shape is less satisfactory than that of the classical kernel density estimator with the Gaussian kernel.
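
The weighted estimator of Hazelton and Turlach (2009) is not reproduced here; the sketch below shows the classical deconvolution kernel density estimator it is compared against, for the double exponential (Laplace) error case, where the deconvoluting kernel with a Gaussian base kernel has the closed form $K^*(u)=\phi(u)\{1-(b/h)^2(u^2-1)\}$ with $b$ the Laplace scale. The bandwidth and error scale in the example are illustrative assumptions.

```python
import numpy as np

def deconv_kde_laplace(x_grid, y, h, b):
    """Classical deconvolution kernel density estimator with a Gaussian base kernel
    when the observations y_j = x_j + e_j carry double exponential (Laplace) errors
    with scale b, i.e. error characteristic function 1 / (1 + b^2 t^2).  In that case
    the deconvoluting kernel is K*(u) = phi(u) * (1 - (b/h)^2 * (u^2 - 1))."""
    u = (x_grid[:, None] - y[None, :]) / h
    phi = np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)
    k_star = phi * (1.0 - (b / h) ** 2 * (u ** 2 - 1.0))
    return k_star.mean(axis=1) / h

# Illustration: recover a standard normal density from Laplace-contaminated data.
rng = np.random.default_rng(0)
x = rng.normal(size=400)
y = x + rng.laplace(scale=0.3, size=400)            # contaminated observations
grid = np.linspace(-4.0, 4.0, 201)
f_hat = deconv_kde_laplace(grid, y, h=0.45, b=0.3)  # estimated density on the grid
```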

Generalizing the Refined Pickands Estimator of the Extreme Value Index

  • Yun, Seok-Hoon
    • Journal of the Korean Statistical Society, v.33 no.3, pp.339-351, 2004
  • In this paper we generalize and improve the refined Pickands estimator of Drees (1995) for the extreme value index. The finite-sample performance of the refined Pickands estimator is not good, particularly when the sample size n is small. For each fixed k = 1, 2, ..., a new estimator is defined as a convex combination of k different generalized Pickands estimators, and its asymptotic normality is established. The optimal weights defining the estimator are also determined so as to minimize its asymptotic variance. Finally, letting k depend upon n, we see that the resulting estimator has better finite-sample behavior as well as better asymptotic efficiency than the refined Pickands estimator.
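
The optimal-weights step rests on a standard fact about variance-minimizing convex combinations. Stated generically (not with the paper's specific generalized Pickands estimators): for asymptotically unbiased estimators $T_1,\dots,T_k$ with asymptotic covariance matrix $\Sigma$, the weights summing to one that minimize the variance of the combination are

\[
  \hat{\gamma}(w) \;=\; \sum_{i=1}^{k} w_i T_i ,
  \qquad
  w^{*} \;=\; \frac{\Sigma^{-1}\mathbf{1}}{\mathbf{1}^{\top}\Sigma^{-1}\mathbf{1}} ,
  \qquad
  \operatorname{Var}\bigl(\hat{\gamma}(w^{*})\bigr) \;=\; \frac{1}{\mathbf{1}^{\top}\Sigma^{-1}\mathbf{1}} .
\]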