Smoothing parameter selection in semi-supervised learning
 Title & Authors
Seok, Kyungha
Semi-supervised learning makes it easy to use unlabeled data in supervised learning tasks such as classification. Applying semi-supervised learning to regression analysis, we propose two methods for better estimation of the regression function. The proposed methods assume different marginal densities of the independent variables and different smoothing parameters for the unlabeled and labeled data. We show that an overfitted pilot estimator should be used to achieve the fastest convergence rate, and that unlabeled data may help improve the convergence rate when the smoothing parameters are well estimated. We also derive the conditions on the smoothing parameters under which the optimal convergence rate is achieved.
Keywords: Asymptotic mean integrated squared error; convergence rate; kernel regression; Nadaraya-Watson estimator; semi-supervised regression; smoothing parameter
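As a concrete illustration of the ideas above, the following is a minimal sketch of the Nadaraya-Watson estimator and of one common semi-supervised scheme, in which an overfitted (small-bandwidth) pilot estimator fitted on the labeled data imputes pseudo-responses for the unlabeled inputs before a final smooth. The function names, bandwidth values, and imputation scheme here are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np

def gaussian_kernel(u):
    """Standard Gaussian kernel."""
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def nadaraya_watson(x_train, y_train, x_eval, h):
    """Nadaraya-Watson kernel regression estimate at the points x_eval
    with smoothing parameter (bandwidth) h."""
    # Pairwise kernel weights: shape (len(x_eval), len(x_train))
    w = gaussian_kernel((x_eval[:, None] - x_train[None, :]) / h)
    return (w @ y_train) / w.sum(axis=1)

def semi_supervised_nw(x_lab, y_lab, x_unlab, x_eval, h_pilot, h_final):
    """A pilot estimator with a small bandwidth h_pilot imputes
    pseudo-responses for the unlabeled inputs; the pooled data are
    then smoothed with a second bandwidth h_final."""
    y_pseudo = nadaraya_watson(x_lab, y_lab, x_unlab, h_pilot)
    x_all = np.concatenate([x_lab, x_unlab])
    y_all = np.concatenate([y_lab, y_pseudo])
    return nadaraya_watson(x_all, y_all, x_eval, h_final)

# Toy data: few labeled points, many unlabeled inputs.
rng = np.random.default_rng(0)
x_lab = rng.uniform(0, 1, 50)
y_lab = np.sin(2 * np.pi * x_lab) + rng.normal(0, 0.2, 50)
x_unlab = rng.uniform(0, 1, 500)
x_eval = np.linspace(0.05, 0.95, 10)
fit = semi_supervised_nw(x_lab, y_lab, x_unlab, x_eval,
                         h_pilot=0.05, h_final=0.08)
```

In this sketch the pilot bandwidth is deliberately smaller than the final one, mirroring the paper's point that an overfitted pilot estimator is needed for the fastest convergence rate; the two bandwidths play the role of the distinct smoothing parameters for labeled and unlabeled data.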
 References
Belkin, M., Niyogi, P. and Sindhwani, V. (2006). Manifold regularization: A geometric framework for learning from labeled and unlabeled examples. The Journal of Machine Learning Research, 7, 2399-2434.

Chapelle, O., Schölkopf, B. and Zien, A. (2006). Semi-supervised learning, MIT Press, Cambridge, MA.

Cortes, C. and Mohri, M. (2007). On transductive regression. Advances in Neural Information Processing Systems, 19, 305-312.

Liu, B., Jing, L., Yu, J. and Jia, L. (2014). Constrained least squares regression for semi-supervised learning. In Advances in Knowledge Discovery and Data Mining, 8444, 110-121.

Lafferty, J. and Wasserman, L. (2008). Statistical analysis of semi-supervised regression. In Advances in Neural Information Processing Systems, 20, 801-808.

Nadaraya, E. A. (1964). On estimating regression. Theory of Probability and its Applications, 9, 141-142.

Niyogi, P. (2008). Manifold regularization and semi-supervised learning: Some theoretical analyses, Technical Report TR-2008-01, Computer science department, University of Chicago, Chicago, IL.

Seok, K. (2012). Study on semi-supervised local constant regression estimation. Journal of the Korean Data & Information Science Society, 23, 579-585.

Seok, K. (2013). A study on semi-supervised kernel ridge regression estimation. Journal of the Korean Data & Information Science Society, 24, 341-353.

Seok, K. (2015). Semisupervised support vector quantile regression. Journal of the Korean Data & Information Science Society, 26, 517-524. crossref(new window)

Suykens, J. A. K., Van Gestel, T., De Brabanter, J., De Moor, B. and Vandewalle, J. (2002). Least squares support vector machines, World Scientific, Singapore.

Wang, M., Hua, X., Song, Y., Dai, L. and Zhang, H. (2006). Semi-supervised kernel regression. In Proceedings of the Sixth International Conference on Data Mining, 1130-1135.

Wasserman, L. (2006). All of nonparametric statistics, Springer, New York.

Watson, G. S. (1964). Smooth regression analysis. Sankhya: The Indian Journal of Statistics, Series A, 26, 359-372.

Wei, R., Pan, L. and Guo, L. (2015). Semi-supervised learning via nonnegative least squares regression. In Proceedings of the 7th International Conference on Internet Multimedia Computing and Service, 15, 105-116.

Xu, S., An, X., Qiao, X., Zhu, L. and Li, L. (2011). Semisupervised least squares support vector regression machines. Journal of Information & Computational Science, 8, 885-892.

Xu, Z., King, I. and Lyu, M. R. (2010). More than semi-supervised learning, LAP LAMBERT Academic Publishing, London.

Zhu, X. (2005). Semi-supervised learning literature survey, Technical Report, Computer Sciences Department, University of Wisconsin, Madison, WI.

Zhu, X. and Goldberg, A. (2009). Introduction to semi-supervised learning, Morgan & Claypool, San Rafael, CA.