Bootstrap Estimate of Prediction Error of Simple Linear Regression Models
Regression analysis is one of the essential tools of statistical inference, used to estimate the relationship between variables. One way to assess a regression model is to estimate its prediction error; the best model is the one with the lowest prediction error. In this paper we estimate the prediction error of simple linear regression models using two different bootstrap methods, Efron's bootstrap and Banks' bootstrap. Both are resampling strategies, but they resample in different ways; we review each of them thoroughly later in the paper. We find that Banks' bootstrap is a good choice in most cases.
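As a rough illustration of the idea (not the paper's own procedure), Efron's bootstrap estimate of prediction error for a simple linear regression can be sketched as follows: draw pairs with replacement, refit the model on each resample, and average the squared prediction error of the refitted model on the original sample. The data, sample size, and number of resamples below are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data for illustration only (not from the paper).
n = 50
x = rng.uniform(0, 10, n)
y = 2.0 + 1.5 * x + rng.normal(0, 1.0, n)

def fit_ols(x, y):
    """Least-squares slope and intercept for simple linear regression."""
    b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
    b0 = y.mean() - b1 * x.mean()
    return b0, b1

B = 200  # number of bootstrap resamples (an arbitrary choice here)
errors = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, n)       # Efron's resample: draw index pairs with replacement
    b0, b1 = fit_ols(x[idx], y[idx])  # refit the model on the resample
    resid = y - (b0 + b1 * x)         # evaluate the refitted model on the original sample
    errors[b] = np.mean(resid ** 2)

pred_error = errors.mean()            # bootstrap estimate of prediction error
print(pred_error)
```

Because the model is refit on each resample but scored on the original data, this average reflects how the fitted line generalizes, rather than the optimistic in-sample residual error.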
J. Neter, M.H. Kutner, C.J. Nachtsheim, W. Wasserman. Applied Linear Statistical Models. McGraw-Hill, 1996.
B. Efron. "Bootstrap methods: Another look at the jackknife." The Annals of Statistics, vol. 7, pp. 1-26, 1979.
D.L. Banks. "Histospline smoothing the Bayesian bootstrap." Biometrika, vol. 75, pp. 673-684, 1988.
J.S. Armstrong. "Illusions in regression analysis." International Journal of Forecasting, vol. 28, pp. 689-694, 2012.
B. Efron. "More efficient bootstrap computations." Journal of the American Statistical Association, vol. 85, pp. 79-89, 1990.
B. Efron, G. Gong. "A leisurely look at the bootstrap, the jackknife, and cross-validation." The American Statistician, vol. 37, pp. 36-48, 1983.
B. Efron, R.J. Tibshirani. An Introduction to the Bootstrap. Chapman and Hall, 1993.
A.C. Davison, D.V. Hinkley. Bootstrap Methods and Their Application. Cambridge University Press, 1997.