• Title/Summary/Keyword: local polynomial regression

Search Results: 25

Selection of Data-adaptive Polynomial Order in Local Polynomial Nonparametric Regression

  • Jo, Jae-Keun
    • Communications for Statistical Applications and Methods / v.4 no.1 / pp.177-183 / 1997
  • A data-adaptive order selection procedure is proposed for local polynomial nonparametric regression. For each candidate polynomial order, the bias and variance are estimated, and the order with the smallest estimated mean squared error is selected locally at each point. To estimate the mean squared error, the empirical bias estimate of Ruppert (1995) and the local polynomial variance estimate of Ruppert, Wand, Holst and Hössjer (1995) are used. Since the proposed method does not require fitting a polynomial of order higher than the selected order, it is simpler than the order selection method proposed by Fan and Gijbels (1995b).

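The order-selection idea above can be sketched as follows. The bias proxy (change in the fit when the bandwidth is inflated) and variance proxy below are crude stand-ins for the Ruppert-style estimates the paper actually uses, and the Gaussian kernel and the factor 1.5 are assumptions for illustration:

```python
import numpy as np

def local_poly_fit(x, y, x0, h, order):
    """Kernel-weighted least-squares polynomial fit centered at x0;
    evaluating at 0 gives the fitted value at x0."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)          # Gaussian kernel weights
    coefs = np.polyfit(x - x0, y, order, w=np.sqrt(w))
    return float(np.polyval(coefs, 0.0))

def adaptive_order(x, y, x0, h, max_order=3):
    """Select, at x0, the polynomial order with the smallest crudely
    estimated local MSE (squared bias proxy + variance proxy)."""
    best_p, best_mse, best_fit = 0, np.inf, 0.0
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    for p in range(max_order + 1):
        coefs = np.polyfit(x - x0, y, p, w=np.sqrt(w))
        fit = float(np.polyval(coefs, 0.0))
        # bias proxy: how much the fit moves when the bandwidth is inflated
        bias = local_poly_fit(x, y, x0, 1.5 * h, p) - fit
        resid = y - np.polyval(coefs, x - x0)
        sigma2 = np.sum(w * resid ** 2) / np.sum(w)   # local residual variance
        mse = bias ** 2 + sigma2 / np.sum(w)          # MSE proxy at x0
        if mse < best_mse:
            best_p, best_mse, best_fit = p, mse, fit
    return best_p, best_fit
```

Note that, as in the paper's motivation, no fit of order higher than `max_order` is ever needed.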

On Convex Combination of Local Constant Regression

  • Mun, Jung-Won;Kim, Choong-Rak
    • Communications for Statistical Applications and Methods / v.13 no.2 / pp.379-387 / 2006
  • Local polynomial regression is widely used because of its good properties, such as adaptation to various types of designs, the absence of boundary effects, and minimax efficiency. Choi and Hall (1998) proposed an estimator of the regression function using a convex combination idea; they showed that a convex combination of three local linear estimators produces an estimator with the same order of bias as a local cubic smoother. In this paper we suggest another estimator of the regression function based on a convex combination of five local constant estimates. It turns out that this estimator has the same order of bias as a local cubic smoother.
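The flavor of bias reduction by combining local constant estimates can be illustrated with a much simpler construction than the paper's five-estimate convex combination: an affine (Richardson-extrapolation) combination of two Nadaraya-Watson estimates at bandwidths h and lam*h, weighted to cancel the leading O(h^2) bias term. This is a hedged sketch of the general idea, not the authors' estimator (whose weights and shifts differ):

```python
import numpy as np

def local_constant(x, y, x0, h):
    """Nadaraya-Watson (local constant) estimate at x0, Gaussian kernel."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def combined_estimate(x, y, x0, h, lam=1.5):
    """Combine estimates at bandwidths h and lam*h with weights that
    cancel the O(h^2) bias term (weights sum to 1 but are affine,
    not convex, in this simplified sketch)."""
    f_h = local_constant(x, y, x0, h)
    f_lh = local_constant(x, y, x0, lam * h)
    return (lam ** 2 * f_h - f_lh) / (lam ** 2 - 1.0)
```

On a noiseless quadratic the combined estimate is markedly closer to the truth at interior points than either single-bandwidth estimate.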

ON MARGINAL INTEGRATION METHOD IN NONPARAMETRIC REGRESSION

  • Lee, Young-Kyung
    • Journal of the Korean Statistical Society / v.33 no.4 / pp.435-447 / 2004
  • In additive nonparametric regression, Linton and Nielsen (1995) showed that marginal integration applied to the local linear smoother produces a rate-optimal estimator of each univariate component function when the dimension of the predictor is two. In this paper we give new formulas for the bias and variance of marginal integration regression estimators that are valid in boundary areas as well as at fixed interior points, and we show that the local linear marginal integration estimator is in fact rate-optimal when the dimension of the predictor is less than or equal to four. We also extend the results to the local polynomial smoother.
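For the two-dimensional case the marginal integration recipe is short: fit a full-dimensional smoother, then average it over the empirical distribution of the other covariate. A minimal sketch with a bivariate local linear smoother and a Gaussian product kernel (both assumptions; the paper's boundary analysis is not reproduced here):

```python
import numpy as np

def local_linear_2d(X, y, x0, h):
    """Bivariate local linear fit; returns the fitted value at x0."""
    w = np.exp(-0.5 * np.sum(((X - x0) / h) ** 2, axis=1))
    Z = np.column_stack([np.ones(len(X)), X - x0])  # intercept + linear terms
    A = Z.T @ (Z * w[:, None])
    b = Z.T @ (w * y)
    return float(np.linalg.solve(A, b)[0])          # intercept = fit at x0

def marginal_integration(X, y, x1, h):
    """First additive component at x1 (up to a constant): average the
    full-dimensional smoother over the observed values of X2."""
    fits = [local_linear_2d(X, y, np.array([x1, x2]), h) for x2 in X[:, 1]]
    return float(np.mean(fits))
```

Since the additive component is only identified up to a constant, differences such as g(0.7) - g(0.3) are the natural quantity to check.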

A Study on Kernel Type Discontinuity Point Estimations

  • Huh, Jib
    • Journal of the Korean Data and Information Science Society / v.14 no.4 / pp.929-937 / 2003
  • Kernel-type estimators of a discontinuity point at an unknown location in a regression function or its derivatives have been developed. It is known that the discontinuity point estimator based on the Gasser-Müller regression estimator with a one-sided kernel that takes the value zero at the point 0 has poor asymptotic behavior. Moreover, the asymptotic variance of the Gasser-Müller regression estimator in the random design case is 1.5 times larger than in the corresponding fixed design case, whereas the two coincide for the local polynomial regression estimator. Even when a modified Gasser-Müller estimator with a one-sided kernel that is non-zero at the point 0 is used, computer simulations show that this phenomenon also appears in discontinuity point estimation.

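The basic one-sided-kernel construction behind such estimators can be sketched in a few lines: at each candidate location, compare a left-sided and a right-sided local mean, and take the maximizer of the absolute difference as the discontinuity point. One-sided uniform kernels (non-zero at 0) are assumed here; the paper's Gasser-Müller variants are not reproduced:

```python
import numpy as np

def jump_location(x, y, h, grid):
    """Estimate a discontinuity point as the maximizer of the absolute
    difference between right- and left-sided local means computed with
    one-sided uniform kernels of width h."""
    diffs = []
    for t in grid:
        left = y[(x >= t - h) & (x < t)].mean()     # left-sided local mean
        right = y[(x >= t) & (x <= t + h)].mean()   # right-sided local mean
        diffs.append(abs(right - left))
    return grid[int(np.argmax(diffs))]
```

The candidate `grid` should stay at least h away from the design boundary so both one-sided windows are nonempty.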

On Variable Bandwidth Kernel Regression Estimation

  • Seog, Kyung-Ha;Chung, Sung-Suk;Kim, Dae-Hak
    • Journal of the Korean Data and Information Science Society / v.9 no.2 / pp.179-188 / 1998
  • Local polynomial regression is the most popular kernel-type regression estimator. As in kernel estimation, bandwidth selection is a crucial problem in local polynomial regression function estimation. When the regression curve has a complicated structure, variable bandwidth selection is appropriate. In this paper, we propose a fully data-driven variable bandwidth selection method: the bandwidth is chosen by minimizing the estimated MSE, which is computed with a pilot bandwidth obtained via cross-validation. A Monte Carlo simulation was conducted to show the superiority of the proposed bandwidth selection method.

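A minimal variable-bandwidth sketch in this spirit: at each point, choose from a bandwidth grid the value minimizing a locally weighted leave-one-out squared error. The local weighting with a pilot bandwidth and the Nadaraya-Watson smoother are simplifying assumptions; this is a stand-in for the paper's pilot estimated-MSE criterion, not its exact procedure:

```python
import numpy as np

def nw_loo(x, y, h):
    """Leave-one-out Nadaraya-Watson fits at all data points."""
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    np.fill_diagonal(K, 0.0)                       # drop the own observation
    return (K @ y) / K.sum(axis=1)

def local_bandwidth(x, y, x0, h_grid, pilot_h):
    """At x0, pick the bandwidth minimizing a leave-one-out squared
    error localized around x0 by a pilot-bandwidth kernel weight."""
    w = np.exp(-0.5 * ((x - x0) / pilot_h) ** 2)
    scores = [np.sum(w * (y - nw_loo(x, y, h)) ** 2) / np.sum(w)
              for h in h_grid]
    return h_grid[int(np.argmin(scores))]
```

Repeating this at every estimation point yields a bandwidth function h(x) rather than a single global value.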

Study on semi-supervised local constant regression estimation

  • Seok, Kyung-Ha
    • Journal of the Korean Data and Information Science Society / v.23 no.3 / pp.579-585 / 2012
  • Many different semi-supervised learning algorithms have been proposed for use with unlabeled data; however, most of them focus on classification problems. In this paper we propose a semi-supervised regression algorithm called the semi-supervised local constant estimator (SSLCE), based on the local constant estimator (LCE), and derive the asymptotic properties of the SSLCE. We also show that the SSLCE has a faster convergence rate than the LCE when a well-chosen weighting factor is employed. Our experiment with synthetic data shows that the SSLCE can improve performance with unlabeled data, and we recommend using it with an appropriately sized unlabeled data set.
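One simple way unlabeled data can enter a local constant smoother is via pseudo-labels: impute responses for the unlabeled points with the labeled-data LCE, then smooth labeled and pseudo-labeled points together, down-weighting the latter. This illustrative construction and the weighting factor `lam` are assumptions for the sketch, not the paper's exact SSLCE:

```python
import numpy as np

def lce(x, y, x0, h):
    """Local constant estimator (LCE) with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def sslce_sketch(x_lab, y_lab, x_unl, x0, h, lam=0.5):
    """Semi-supervised sketch: pseudo-label the unlabeled points with
    the labeled LCE, then smooth everything together with the unlabeled
    points down-weighted by lam."""
    y_unl = np.array([lce(x_lab, y_lab, xi, h) for xi in x_unl])
    x = np.concatenate([x_lab, x_unl])
    y = np.concatenate([y_lab, y_unl])
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    w[len(x_lab):] *= lam                  # weighting factor on unlabeled data
    return np.sum(w * y) / np.sum(w)
```

The choice of `lam` plays the role of the weighting factor discussed in the abstract: too large and the pseudo-labels dominate, too small and the unlabeled data are wasted.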

Estimation of Density via Local Polynomial Regression

  • Park, B. U.;Kim, W. C.;J. Huh;J. W. Jeon
    • Journal of the Korean Statistical Society / v.27 no.1 / pp.91-100 / 1998
  • A method of estimating probability density using regression tools is presented here. It is based on equal-length binning and locally weighted approximate likelihood for bin counts. The method is particularly useful for densities with bounded supports, where it automatically corrects edge effects without using boundary kernels.

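The binning-plus-regression recipe can be sketched directly: bin the sample into equal-length bins, normalize the counts into density heights, and smooth the heights with a local polynomial. Weighted least squares stands in here for the paper's locally weighted approximate likelihood, and the kernel, bandwidth, and bin count are illustrative assumptions:

```python
import numpy as np

def binned_density(data, a, b, nbins=50, h=0.08, order=1):
    """Return a density estimate built by local polynomial regression
    on normalized equal-length bin counts."""
    counts, edges = np.histogram(data, bins=nbins, range=(a, b))
    centers = 0.5 * (edges[:-1] + edges[1:])
    heights = counts / (len(data) * (edges[1] - edges[0]))  # crude density
    def fhat(x0):
        w = np.exp(-0.5 * ((centers - x0) / h) ** 2)
        coefs = np.polyfit(centers - x0, heights, order, w=np.sqrt(w))
        return max(float(np.polyval(coefs, 0.0)), 0.0)      # clip at zero
    return fhat
```

Because a local linear fit adapts automatically near the edges of the support, the estimate stays close to the truth at the boundary without boundary kernels, which is the edge-correction behavior the abstract highlights.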

Robust Nonparametric Regression Method using Rank Transformation

  • Park, Dongryeon
    • Communications for Statistical Applications and Methods / v.7 no.2 / pp.575-583 / 2000
  • Consider the problem of estimating a regression function from a set of data contaminated by a long-tailed error distribution. A linear smoother is a kind of locally weighted average of the responses, so it is not robust against outliers. The kernel M-smoother and the lowess attain robustness by down-weighting outliers; however, both require iteration to compute the robustness weights, and as Wang and Scott (1994) pointed out, the requirement of iteration is not a desirable property. In this article, we propose a robust nonparametric regression method that does not require iteration. Robustness can be achieved not only by down-weighting outliers but also by transforming them. The rank transformation is a simple procedure in which the data are replaced by their corresponding ranks. Iman and Conover (1979) showed that the rank transformation is a robust and powerful procedure in linear regression. In this paper, we show that the rank transformation can also be applied to nonparametric regression to achieve robustness.

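One way to carry the rank idea into nonparametric regression, sketched under the assumption of a Gaussian kernel and no ties: kernel-smooth the ranks of the responses, then map the smoothed rank back to the response scale through the empirical quantiles of y. No iteration is needed, and outliers influence the fit only through their bounded ranks (this is an illustrative realization, not necessarily the paper's exact estimator):

```python
import numpy as np

def rank_smoother(x, y, x0, h):
    """Robust smoother via the rank transformation: smooth the ranks of
    y, then invert through the empirical quantile function of y."""
    n = len(y)
    ranks = np.argsort(np.argsort(y)) + 1.0        # ranks 1..n (no ties assumed)
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    rhat = np.sum(w * ranks) / np.sum(w)           # smoothed rank at x0
    return float(np.interp(rhat, np.arange(1, n + 1), np.sort(y)))

def nw(x, y, x0, h):
    """Plain local constant smoother, for comparison."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)
```

With a few gross outliers nearby, the plain smoother is dragged far from the curve while the rank smoother barely moves.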

Bandwidth Selection for Local Smoothing Jump Detector

  • Park, Dong-Ryeon
    • Communications for Statistical Applications and Methods / v.16 no.6 / pp.1047-1054 / 2009
  • The local smoothing jump detection procedure is a popular method for detecting jump locations, and the performance of the jump detector depends heavily on the choice of the bandwidth. However, little work has been done on this issue. In this paper, we propose a bootstrap bandwidth selection method that can be used with any kernel-based or local polynomial-based jump detector. The proposed bandwidth selection method is fully data-adaptive, and its performance is evaluated through a simulation study and a real data example.
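A minimal bootstrap-bandwidth sketch for a jump detector, under several simplifying assumptions: the detector is a basic one-sided-means detector (not the paper's), residuals are taken around a crude piecewise-constant fit, and the criterion is the bootstrap mean squared deviation of the estimated jump location around the full-sample estimate:

```python
import numpy as np

def detect_jump(x, y, h, grid):
    """One-sided-means jump detector: maximize |right mean - left mean|."""
    diffs = [abs(y[(x >= t) & (x <= t + h)].mean()
                 - y[(x >= t - h) & (x < t)].mean()) for t in grid]
    return grid[int(np.argmax(diffs))]

def bootstrap_bandwidth(x, y, h_grid, grid, B=50, seed=0):
    """Pick the bandwidth whose residual-bootstrap jump-location
    estimates scatter least around the full-sample estimate."""
    rng = np.random.default_rng(seed)
    best_h, best_score = h_grid[0], np.inf
    for h in h_grid:
        t0 = detect_jump(x, y, h, grid)
        # crude piecewise-constant fit at the estimated jump
        fit = np.where(x < t0, y[x < t0].mean(), y[x >= t0].mean())
        resid = y - fit
        scores = [(detect_jump(x, fit + rng.choice(resid, len(x)), h, grid)
                   - t0) ** 2 for _ in range(B)]
        score = float(np.mean(scores))
        if score < best_score:
            best_h, best_score = h, score
    return best_h
```

The same resampling loop works with any detector: only `detect_jump` needs to be swapped out, which mirrors the abstract's claim that the method applies to any kernel-based or local polynomial-based jump detector.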