• Search query (title, summary, keyword): Inverse Gaussian distribution

44 search results

Shrinkage Estimator of Dispersion of an Inverse Gaussian Distribution

  • Lee, In-Suk; Park, Young-Soo
    • Journal of the Korean Data and Information Science Society / v.17 no.3 / pp.805-809 / 2006
  • In this paper, a shrinkage estimator for the measure of dispersion of the inverse Gaussian distribution with known mean is proposed. We also compare the proposed estimator with the minimum variance unbiased estimator in terms of relative bias and relative efficiency.

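As an illustrative sketch of the idea (not the paper's specific estimator): for IG($\mu$, $\lambda$) with known mean, the identity $E[1/X] = 1/\mu + 1/\lambda$ yields an unbiased estimator of the dispersion $1/\lambda$, which a linear shrinkage rule then pulls toward a prior guess. The weight `alpha` and guess `theta0` below are arbitrary assumptions for demonstration.

```python
import numpy as np

def ig_dispersion_mvue(x, mu):
    """Unbiased estimator of the dispersion 1/lambda of IG(mu, lambda)
    when the mean mu is known, using E[1/X] = 1/mu + 1/lambda."""
    x = np.asarray(x, dtype=float)
    return np.mean(1.0 / x) - 1.0 / mu

def shrinkage_estimate(theta_hat, theta0, alpha):
    """Generic linear shrinkage of an estimate toward a prior guess theta0,
    with weight 0 <= alpha <= 1 on the data-based estimate."""
    return alpha * theta_hat + (1.0 - alpha) * theta0

rng = np.random.default_rng(0)
mu, lam = 2.0, 5.0
x = rng.wald(mu, lam, size=200)           # IG(mu, lambda) samples
theta_hat = ig_dispersion_mvue(x, mu)     # unbiased estimate of 1/lambda = 0.2
theta_s = shrinkage_estimate(theta_hat, theta0=0.25, alpha=0.8)
```

Shrinking trades a small bias for reduced variance, which is the mechanism the paper's bias/efficiency comparison quantifies.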

Kullback-Leibler Information-Based Tests of Fit for Inverse Gaussian Distribution (역가우스분포에 대한 쿨백-라이블러 정보 기반 적합도 검정)

  • Choi, Byung-Jin
    • The Korean Journal of Applied Statistics / v.24 no.6 / pp.1271-1284 / 2011
  • The entropy-based test of fit for the inverse Gaussian distribution presented by Mudholkar and Tian (2002) can only be applied to the composite hypothesis that a sample is drawn from an inverse Gaussian distribution with both the location and scale parameters unknown. In practice, however, a researcher may want a test of fit for an inverse Gaussian distribution with one parameter known, or with both parameters known. In this paper, we introduce tests of fit for the inverse Gaussian distribution based on the Kullback-Leibler information as an extension of the entropy-based test. A window size must be chosen to implement the proposed tests. By means of Monte Carlo simulations, window sizes are determined for a wide range of sample sizes, and the corresponding critical values of the test statistics are estimated. A power analysis against various alternatives shows that the Kullback-Leibler information-based goodness-of-fit tests have good power.
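The window size mentioned in the abstract enters through Vasicek's spacing-based sample entropy, on which this family of tests is built. A minimal sketch of that estimator (the test statistics themselves are not reproduced here):

```python
import numpy as np

def vasicek_entropy(x, m):
    """Vasicek's sample entropy with window size m:
    H_{m,n} = (1/n) * sum_i log( n/(2m) * (x_(i+m) - x_(i-m)) ),
    with order-statistic indices clamped at the sample boundaries."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(n)
    spacings = x[np.minimum(i + m, n - 1)] - x[np.maximum(i - m, 0)]
    return np.mean(np.log(n / (2.0 * m) * spacings))

rng = np.random.default_rng(0)
h = vasicek_entropy(rng.standard_normal(1000), m=10)
# For N(0,1) the true entropy is 0.5*log(2*pi*e) ~ 1.4189; the estimator
# carries a small negative bias that shrinks as n grows.
```

The estimate depends noticeably on m for small samples, which is why the paper tabulates recommended window sizes by sample size.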

A Test of Fit for Inverse Gaussian Distribution Based on the Probability Integration Transformation (확률적분변환에 기초한 역가우스분포에 대한 적합도 검정)

  • Choi, Byungjin
    • The Korean Journal of Applied Statistics / v.26 no.4 / pp.611-622 / 2013
  • Mudholkar and Tian (2002) proposed an entropy-based test of fit for the inverse Gaussian distribution; however, the test can be applied only to the composite hypothesis of the inverse Gaussian distribution with an unknown location parameter. In this paper, we propose an entropy-based goodness-of-fit test for an inverse Gaussian distribution that can be applied to the composite hypothesis of the inverse Gaussian distribution as well as to the simple hypothesis of the inverse Gaussian distribution with a specified location parameter. The proposed test is based on the probability integral transformation. The critical values of the test statistic, estimated by simulation, are presented in tabular form. A simulation study is performed to compare the proposed test, under selected alternatives, with the test of Mudholkar and Tian (2002) in terms of power. The results show that the proposed test has better power than the previous entropy-based test.
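The probability integral transformation maps the data through the hypothesized IG CDF, so that under the null the transformed values are uniform on (0, 1). A sketch under assumed known parameters, using the closed-form IG CDF and, as a stand-in for the paper's entropy-based statistic, a simple Kolmogorov-Smirnov uniformity check:

```python
import numpy as np
from scipy import stats

def ig_cdf(x, mu, lam):
    """Closed-form CDF of the inverse Gaussian IG(mu, lam):
    F(x) = Phi(sqrt(lam/x)(x/mu - 1)) + exp(2 lam/mu) Phi(-sqrt(lam/x)(x/mu + 1))."""
    x = np.asarray(x, dtype=float)
    a = np.sqrt(lam / x)
    return (stats.norm.cdf(a * (x / mu - 1.0))
            + np.exp(2.0 * lam / mu) * stats.norm.cdf(-a * (x / mu + 1.0)))

rng = np.random.default_rng(1)
mu, lam = 1.5, 3.0
x = rng.wald(mu, lam, size=500)
u = ig_cdf(x, mu, lam)            # probability integral transform: u ~ U(0,1) under H0
ks = stats.kstest(u, "uniform")   # uniformity check (stand-in for the entropy statistic)
```

Note that `scipy.stats.invgauss` uses the shape/scale parameterization `invgauss(mu/lam, scale=lam)` for IG(mu, lam), which can replace the hand-written CDF.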

A Modified Entropy-Based Goodness-of-Fit Test for Inverse Gaussian Distribution (역가우스분포에 대한 변형된 엔트로피 기반 적합도 검정)

  • Choi, Byung-Jin
    • The Korean Journal of Applied Statistics / v.24 no.2 / pp.383-391 / 2011
  • This paper presents a modified entropy-based test of fit for the inverse Gaussian distribution. The test is based on the entropy difference between the unknown data-generating distribution and the inverse Gaussian distribution. The entropy difference estimator used as the test statistic is obtained by employing Vasicek's sample entropy as an entropy estimator for the data-generating distribution and the uniformly minimum variance unbiased estimator as an entropy estimator for the inverse Gaussian distribution. The empirically determined critical values of the test statistic are provided in tabular form. Monte Carlo simulations are performed to compare the proposed test with the previous entropy-based test in terms of power.

Comparison of parameter estimation methods for normal inverse Gaussian distribution

  • Yoon, Jeongyoen; Kim, Jiyeon; Song, Seongjoo
    • Communications for Statistical Applications and Methods / v.27 no.1 / pp.97-108 / 2020
  • This paper compares several methods for estimating the parameters of the normal inverse Gaussian distribution. Ordinary maximum likelihood estimation and method of moments estimation often do not work properly due to restrictions on the parameters. We examine the performance of adjusted estimation methods, along with ordinary maximum likelihood estimation and method of moments estimation, by simulation and real data application. We also examine the effect of the initial value on the estimation methods. The simulation results show that the ordinary maximum likelihood estimator is significantly affected by the initial value; in addition, the adjusted estimators have smaller root mean square error than the ordinary estimators and are less sensitive to the initial value. With real datasets, we obtain results similar to those of the simulation studies. Based on the results of the simulation and real data application, we suggest using adjusted maximum likelihood estimates, with adjusted method of moments estimates as initial values, to estimate the parameters of the normal inverse Gaussian distribution.
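A minimal sketch of plain NIG maximum likelihood fitting with `scipy.stats.norminvgauss`, assuming synthetic data; the paper's adjusted estimators are not reproduced, and the starting values below are arbitrary stand-ins for its adjusted method-of-moments initial values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
a, b = 2.0, 0.5                    # scipy's tail and asymmetry shapes (require |b| < a)
data = stats.norminvgauss.rvs(a, b, loc=0.0, scale=1.0, size=1000, random_state=rng)

# Plain MLE via scipy's generic fit; initial guesses for the shape parameters
# are passed positionally and for loc/scale as keywords.
a_hat, b_hat, loc_hat, scale_hat = stats.norminvgauss.fit(
    data, 1.0, 0.0, loc=np.mean(data), scale=np.std(data))
```

Supplying a valid starting point (here `a=1, b=0`, i.e. a symmetric guess inside the constraint `|b| < a`) matters in practice, which is exactly the sensitivity the paper investigates.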

A Graphical Method to Assess Goodness-of-Fit for Inverse Gaussian Distribution (역가우스분포에 대한 적합도 평가를 위한 그래프 방법)

  • Choi, Byungjin
    • The Korean Journal of Applied Statistics / v.26 no.1 / pp.37-47 / 2013
  • A Q-Q plot is an effective and convenient graphical method to assess a distributional assumption of data. The primary step in the construction of a Q-Q plot is to obtain a closed-form expression for the relation between the observed quantiles and the theoretical quantiles to be plotted, so that the points fall near the line y = a + bx. In this paper, we introduce a Q-Q plot to assess goodness-of-fit for the inverse Gaussian distribution. The procedure is based on the distributional result that the transformed random variable $Y = \left|\sqrt{\lambda}(X-\mu)/(\mu\sqrt{X})\right|$ follows a standard half-normal distribution when the random variable X has an inverse Gaussian distribution with location parameter $\mu$ and scale parameter $\lambda$. Simulations are performed to provide a guideline for interpreting the pattern of points on the proposed inverse Gaussian Q-Q plot. An illustrative example is provided to show the usefulness of the inverse Gaussian Q-Q plot.
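The construction above can be sketched directly: apply the half-normal transformation, then pair the sorted values with theoretical half-normal quantiles. The plotting positions $(i-0.5)/n$ are a common convention assumed here, not necessarily the paper's choice:

```python
import numpy as np
from scipy import stats

def ig_halfnormal_qq(x, mu, lam):
    """Coordinates for the IG Q-Q plot: under IG(mu, lam),
    Y = |sqrt(lam) * (X - mu) / (mu * sqrt(X))| is standard half-normal."""
    x = np.asarray(x, dtype=float)
    y = np.sort(np.abs(np.sqrt(lam) * (x - mu) / (mu * np.sqrt(x))))
    n = len(y)
    p = (np.arange(1, n + 1) - 0.5) / n   # plotting positions (a common choice)
    q = stats.halfnorm.ppf(p)             # theoretical half-normal quantiles
    return q, y                           # under H0 the points fall near y = x

rng = np.random.default_rng(3)
q, y = ig_halfnormal_qq(rng.wald(2.0, 4.0, size=300), mu=2.0, lam=4.0)
```

Plotting `y` against `q` (e.g. with matplotlib) gives the proposed diagnostic; strong curvature away from the diagonal signals a departure from the inverse Gaussian model.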

A numerical study of adjusted parameter estimation in normal inverse Gaussian distribution (Normal inverse Gaussian 분포에서 모수추정의 보정 방법 연구)

  • Yoon, Jeongyoen; Song, Seongjoo
    • The Korean Journal of Applied Statistics / v.29 no.4 / pp.741-752 / 2016
  • Numerous studies have shown that the normal inverse Gaussian (NIG) distribution adequately fits the empirical return distribution of financial securities. The estimation of parameters can also be done relatively easily, which makes the NIG distribution more useful in financial markets. Maximum likelihood estimation and method of moments estimation are easy to implement; however, we may encounter a problem in practice when a relationship among the moments is violated. In this paper, we investigate this problem in the parameter estimation and try to find a simple solution through simulations. We examine the effect of our adjusted estimation method on real data: daily log returns of KOSPI, S&P500, FTSE and HANG SENG. We also check the performance of our method by computing the value at risk of the daily log return data. The results show that our method improves the stability of parameter estimation while retaining comparable goodness-of-fit.
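The value-at-risk check described above amounts to reading a lower quantile off the fitted NIG distribution. A sketch on synthetic data (a real analysis would load the KOSPI/S&P500/FTSE/HANG SENG series instead; parameter values and starting guesses are illustrative assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Stand-in for a series of daily log returns.
returns = stats.norminvgauss.rvs(5.0, -0.5, loc=0.0005, scale=0.01,
                                 size=2000, random_state=rng)

# Fit NIG by MLE with simple (arbitrary) starting values, then read the
# 99% one-day VaR off the lower 1% quantile of the fitted distribution.
params = stats.norminvgauss.fit(returns, 1.0, 0.0,
                                loc=np.mean(returns), scale=np.std(returns))
var_99 = -stats.norminvgauss.ppf(0.01, *params)
coverage = np.mean(returns < -var_99)   # in-sample exceedance rate, near 0.01
```

Comparing the empirical exceedance rate with the nominal level is one simple way to judge whether an estimation method produces usable tail quantiles.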

Minimum Variance Unbiased Estimation for the Maximum Entropy of the Transformed Inverse Gaussian Random Variable by Y=X^(-1/2)

  • Choi, Byung-Jin
    • Communications for Statistical Applications and Methods / v.13 no.3 / pp.657-667 / 2006
  • The concept of entropy, introduced into communication theory by Shannon (1948) as a measure of uncertainty, is of prime interest in information-theoretic statistics. This paper considers minimum variance unbiased estimation of the maximum entropy of the inverse Gaussian random variable transformed by $Y=X^{-1/2}$. The properties of the derived UMVU estimator are investigated.
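As a Monte Carlo illustration of the quantity being estimated (not the paper's UMVU estimator): sample from an inverse Gaussian distribution, apply the transformation, and estimate the entropy of the transformed variable nonparametrically with scipy's spacing-based estimator:

```python
import numpy as np
from scipy.stats import differential_entropy

rng = np.random.default_rng(5)
x = rng.wald(1.0, 2.0, size=5000)   # X ~ IG(mu=1, lambda=2); parameters are illustrative
y = x ** -0.5                       # transformed variable Y = X^{-1/2}
h_y = differential_entropy(y)       # nonparametric (spacing-based) entropy estimate
```

The UMVU estimator derived in the paper targets the same entropy but exploits the parametric IG form, so it is far more efficient than this generic nonparametric estimate.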

Noninformative Priors for the Ratio of Parameters in Inverse Gaussian Distribution (INVERSE GAUSSIAN분포의 모수비에 대한 무정보적 사전분포에 대한 연구)

  • Kang, Sang-Gil; Kim, Dal-Ho; Lee, Woo-Dong
    • The Korean Journal of Applied Statistics / v.17 no.1 / pp.49-60 / 2004
  • In this paper, when the observations follow an inverse Gaussian distribution, we develop noninformative priors for the ratio of the parameters of the inverse Gaussian distribution. We derive the first-order matching prior and prove that a second-order matching prior does not exist. It turns out that the one-at-a-time reference prior satisfies the first-order matching criterion. A simulation study is performed.