Tree-Structured Nonlinear Regression
Chang, Young-Jae; Kim, Hyeon-Soo
Tree algorithms have been widely developed for regression problems. One attractive feature of a regression tree is its fitting flexibility: it can capture the nonlinearity of data well. In particular, data with sudden structural breaks, such as oil prices and exchange rates, can be fitted well by a simple mixture of a few piecewise linear regression models. Because split points are determined by chi-squared statistics computed from the residuals of piecewise linear fits, and the split variable is chosen by an objective criterion, we obtain a reasonable fit that agrees with the visual interpretation of the data. Piecewise linear regression by a regression tree is therefore a good fitting method and can be applied to datasets with large fluctuations.
Keywords: regression tree; nonlinearity; piecewise regression; GUIDE
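The splitting scheme described in the abstract can be sketched in code. This is a minimal illustration, not the GUIDE algorithm itself: where the paper selects splits with chi-squared statistics on residuals from piecewise linear fits, the sketch below substitutes the simpler criterion of residual sum-of-squares reduction, and all function names are hypothetical.

```python
import numpy as np

def fit_line(x, y):
    """Least-squares linear fit; returns coefficients and residual SSE."""
    A = np.vstack([x, np.ones_like(x)]).T
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    sse = float(np.sum((y - A @ coef) ** 2))
    return coef, sse

def best_split(x, y, min_leaf=5):
    """Scan candidate split points on sorted x; return (index, SSE reduction)
    of the split that most improves on a single linear fit."""
    _, sse_parent = fit_line(x, y)
    best = None
    for i in range(min_leaf, len(x) - min_leaf):
        _, sse_l = fit_line(x[:i], y[:i])
        _, sse_r = fit_line(x[i:], y[i:])
        gain = sse_parent - (sse_l + sse_r)
        if best is None or gain > best[1]:
            best = (i, gain)
    return best

def piecewise_tree(x, y, depth=0, max_depth=2, min_leaf=5):
    """Recursively split the data, fitting a linear model in each leaf."""
    split = best_split(x, y, min_leaf) if depth < max_depth else None
    if split is None or split[1] <= 1e-8:  # no worthwhile split: make a leaf
        coef, _ = fit_line(x, y)
        return {"leaf": True, "coef": coef}
    i = split[0]
    return {"leaf": False, "cut": float(x[i]),
            "left": piecewise_tree(x[:i], y[:i], depth + 1, max_depth, min_leaf),
            "right": piecewise_tree(x[i:], y[i:], depth + 1, max_depth, min_leaf)}
```

On a series with a structural break, such as y = x for x < 5 and y = 10 - x afterwards, the first split lands at the break and each child reduces to a single linear model, mirroring the "simple mixture of a few piecewise linear regression models" described above.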
Cited by

Chang, Y. (2012). Estimation of nonlinear impulse responses of stock indices by asset class, Korean Journal of Applied Statistics, 25, 239-249.

Chang, Y. (2014). Panel data analysis with regression trees, Journal of the Korean Data and Information Science Society, 25, 1253-1262.

An analysis of changes in the influence of GDP gap on inflation (2015). Journal of the Korean Data and Information Science Society, 26, 1377.
References

Breiman, L. (1996). Bagging predictors, Machine Learning, 24, 123-140.

Breiman, L. (2001). Random forests, Machine Learning, 45, 5-32.

Breiman, L., Friedman, J., Stone, C. and Olshen, R. A. (1984). Classification and Regression Trees, 1st Edition, Chapman & Hall/CRC.

Chang, Y. (2010). The analysis of factors which affect business survey index using regression trees, The Korean Journal of Applied Statistics, 23, 63-71.

Kim, H., Loh, W.-Y., Shih, Y.-S. and Chaudhuri, P. (2006). A visualizable and interpretable regression model with good prediction power, IIE Transactions, Special Issue on Data Mining and Web Mining.

Loh, W.-Y. (2002). Regression trees with unbiased variable selection and interaction detection, Statistica Sinica, 12, 361-386.

Loh, W.-Y. (2008). Regression by parts: Fitting visually interpretable models with GUIDE, In Handbook of Data Visualization, C. Chen, W. Hardle, and A. Unwin, Eds. Springer, 447-469.

Strobl, C., Boulesteix, A.-L., Zeileis, A. and Hothorn, T. (2007). Bias in random forest variable importance measures: Illustrations, sources and a solution, BMC Bioinformatics, 8, 25.