References
Akritas, M. G. (1986). Bootstrapping the Kaplan–Meier estimator. J. Amer. Statist. Assoc. 81, 1032–1038.
Beran, R. (1987). Prepivoting to reduce level error of confidence sets. Biometrika 74, 457-468.
Bhattacharya, R.N. and Ghosh, J.K. (1978). On the validity of the formal Edgeworth expansion. Ann. Statist. 6, 434–451.
Bickel, P.J. and Freedman, D.A. (1981). Some asymptotic theory for the bootstrap. Ann. Statist. 9, 1196–1217.
Bock, M., Bowman, A.W. and Ismail, B. (2007). Estimation and inference for error variance in bivariate nonparametric regression. Statistics and Computing 17, 39–47.
Bowman, A.W. and Azzalini, A. (2019). R package ‘sm’: nonparametric smoothing methods (version 2.2). http://www.stats.gla.ac.uk/~adrian/sm.
Bowman, A., Hall, P. and Prvan, T. (1998). Bandwidth selection for the smoothing of distribution functions. Biometrika 85(4), 799–808.
Breslow, N. and Crowley, J. (1974). A large sample study of the life table and product limit estimates under random censorship. Ann. Statist. 2, 437–453.
Bühlmann, P. (1994). Blockwise bootstrap empirical processes for stationary sequences. Ann. Statist. 22, 995-1012.
Bühlmann, P. (1997). Sieve bootstrap for time series. Bernoulli 3, 123-148.
Bühlmann, P. (1998). Sieve bootstrap for smoothing in nonstationary time series. Ann. Statist. 26, 48-83.
Canty, A.J. (2002). Resampling methods in R: the boot package. R News (The Newsletter of the R Project) 2(3), 2–7.
Cao, R. (1990). Órdenes de convergencia para las aproximaciones normal y bootstrap en la estimación no paramétrica de la función de densidad. Trabajos de Estadística 5(2), 23–32.
Cao, R. (1991). Rate of convergence for the wild bootstrap in nonparametric regression. Ann. Statist. 19, 2226–2231.
Cao, R. (1993). Bootstrapping the mean integrated squared error. J. Multivariate Anal. 45, 137–160.
Cao, R. (1999). An overview of bootstrap methods for estimating and predicting in time series. Test 8, 95-116.
Cao, R., Cuevas, A. and González-Manteiga, W. (1993). A comparative study of several smoothing methods in density estimation. Comp. Statist. Data Anal. 17, 153–176.
Cao, R., Febrero-Bande, M., González-Manteiga, W., Prada-Sánchez, J.M. and García-Jurado, I. (1997). Saving computer time in constructing consistent bootstrap prediction intervals for autoregressive processes. Commun. Statist. Simula. 26, 961-978.
Cao, R. and González-Manteiga, W. (1993). Bootstrap methods in regression smoothing. Journal of Nonparametric Statistics 2, 379-388.
Cao, R. and Prada-Sánchez, J.M. (1993). Bootstrapping the mean of a symmetric population. Statistics & Probability Letters 17, 43-48.
Castillo-Páez, S., Fernández-Casal, R. and García-Soidán, P. (2019). A nonparametric bootstrap method for spatial data. Computational Statistics and Data Analysis 137, 1–15.
Castillo-Páez S., Fernández-Casal R. and García-Soidán P. (2020). Nonparametric bootstrap approach for unconditional risk mapping under heteroscedasticity. Spatial Statistics. In Press.
Carlstein, E., Do, K.A., Hall, P., Hesterberg, T. and Künsch, H.R. (1998). Matched-block bootstrap for dependent data. Bernoulli 4(3), 305–328.
Cox, D. R. (1972). Regression Models and Life Tables (with discussion), Journal of the Royal Statistical Society series B 34, 187–220.
Davison, A.C. and Hinkley, D.V. (1997). Bootstrap Methods and their Application. Cambridge University Press.
Efron, B. (1979). Bootstrap methods: another look at the jackknife. Ann. Statist. 7, 1–26.
Efron, B. (1981). Censored data and the bootstrap. J. Amer. Statist. Assoc. 76, 312–319.
Efron, B. (1982). The Jackknife, the Bootstrap and Other Resampling Plans. CBMS-NSF Regional Conference Series in Applied Mathematics. SIAM, Philadelphia.
Efron, B. (1983). Estimating the error rate of a prediction rule: improvements on cross-validation. J. Amer. Statist. Assoc. 78, 316–331.
Efron, B. (1987). Better bootstrap confidence intervals (with discussion). J. Amer. Statist. Assoc. 82, 171–200.
Efron, B. (1990). More efficient bootstrap computations. J. Amer. Statist. Assoc. 85, 79–89.
Efron, B. and Tibshirani, R. (1986). Bootstrap methods for standard errors, confidence intervals, and other measures of statistical accuracy. Statistical Science 1, 54-77.
Efron, B. and Tibshirani, R.J. (1993). An Introduction to the Bootstrap. Chapman and Hall.
Fan J. and Gijbels I. (1996). Local Polynomial Modelling and Its Applications. Chapman and Hall, London.
Fan J. and Yao Q. (1998). Efficient estimation of conditional variance functions in stochastic regression. Biometrika, 85, 645–660.
Faraway, J.J. and Jhun, M. (1990). Bootstrap choice of bandwidth for density estimation. J. Amer. Statist. Assoc. 85, 1119–1122.
Fernández-Casal, R. and Cao, R. (2020). Simulación Estadística. https://rubenfcasal.github.io/simbook.
Ferretti, N. and Romo, J. (1996). Unit root bootstrap test for AR(1) models. Biometrika 83, 849-860.
Fisher, R.A. (1935). The design of experiments. Edinburgh: Oliver and Boyd.
Fox, J. and Weisberg, S. (2018). An R Companion to Applied Regression. Sage Publications.
Freedman, D.A. (1981). Bootstrapping regression models. Ann. Statist. 9(6), 1218–1228.
Fuller, W.A. (1976). Introduction to statistical time series. New York: Wiley.
Gannoun, A. (1990). Estimation non paramétrique de la médiane conditionnelle. Application à la prévision. C. R. Acad. Sci. Paris 310, 295-298.
García-Jurado, I., González-Manteiga, W., Prada-Sánchez, J.M., Febrero-Bande, M. and Cao, R. (1995). Predicting using Box-Jenkins, nonparametric and bootstrap techniques. Technometrics 37, 303–310.
González-Manteiga, W., and Cao, R. (1993). Testing the hypothesis of a general linear model using nonparametric regression estimation. Test, 2, 161-188.
González-Manteiga, W. and Prada-Sánchez, J.M. (1985). Una aplicación de los métodos de suavización no paramétricos en la técnica bootstrap. Proceedings Jornadas Hispano-Lusas de Matemáticas. Murcia, 1985.
González-Manteiga, W., Prada-Sánchez, J.M. and Romo, J. (1994). The bootstrap: a review. Computational Statistics 9, 165–205.
Hall, P. (1986). On the bootstrap and confidence intervals. Ann. Statist. 14, 1431-1452.
Hall, P. (1988a). Theoretical comparison of bootstrap confidence intervals. Ann. Statist. 16, 927–953.
Hall, P. (1988b). Rate of convergence in bootstrap approximations. Ann. Probab. 16(4), 1665–1684.
Hall, P. (1990). Using the bootstrap to estimate mean squared error and select smoothing parameter in nonparametric problems. J. Multivariate Anal. 32, 177–203.
Hall, P. (1992). The Bootstrap and Edgeworth Expansion. Springer-Verlag.
Hall, P., Horowitz, J.L. and Jing, B-Y. (1995). On blocking rules for the bootstrap with dependent data. Biometrika 82, 561-574.
Hall, P., Marron, J.S. and Park, B. (1992). Smoothed cross-validation. Probab. Theor. Rel. Fields 92, 1–20.
Hall, P. and Martin, M.A. (1988). On bootstrap resampling and iteration. Biometrika 75, 661-671.
Härdle, W. and Mammen, E. (1993). Comparing nonparametric versus parametric regression fits. Ann. Statist. 21, 1926-1947.
Härdle, W. and Marron, J. S. (1991). Bootstrap simultaneous error bars for nonparametric regression. Ann. Statist. 19, 778–796.
Hartigan, J.A. (1969). Using subsample values as typical values. J. Amer. Statist. Assoc. 64, 1303-1317.
Heimann, G. and Kreiss, J-P. (1996). Bootstrapping general first order autoregression. Statist. Prob. Lett. 30, 87-98.
Kaplan, E.L. and Meier, P. (1958). Nonparametric estimation from incomplete observations. J. Amer. Statist. Assoc. 53, 457–481.
Kreiss, J-P. and Franke, J. (1992). Bootstrapping stationary autoregressive moving average models. J. Time Ser. Anal. 13, 297-317.
Künsch, H.R. (1989). The jackknife and the bootstrap for general stationary observations. Ann. Statist. 17, 1217-1241.
Liu, R.Y. and Singh, K. (1992). Moving blocks jackknife and bootstrap capture weak dependence. In Exploring the Limits of Bootstrap (R. LePage and L. Billard, Eds.), pp. 225–248. New York: Wiley.
Mammen, E. (1992). When Does Bootstrap Work? Asymptotic Results and Simulations. Springer-Verlag.
Maritz, J.S. (1979). A note on exact robust confidence intervals for location. Biometrika 66, 163-166.
Marron, J.S. (1992). Bootstrap bandwidth selection. In Exploring the limits of the bootstrap, LePage, R. and Billard, L. eds., pp. 249–262. New York: Wiley.
Nadaraya, E.A. (1964). On estimating regression. Theor. Probab. Appl. 9, 141-142.
Naik-Nimbalkar, U.V. and Rajarshi, M.B. (1994). Validity of blockwise bootstrap for empirical processes with stationary observations. Ann. Statist. 22, 980-994.
Navidi, W. (1989). Edgeworth expansions for bootstrapping regression models. Ann. Statist. 17 4, 1472-1478.
Paparoditis, E. (1996). Bootstrapping autoregressive and moving average parameter estimates of infinite order vector autoregressive processes. J. Multivariate Anal. 57, 277–296.
Parzen, E. (1962). On estimation of a probability density function and mode. Ann. Math. Statist. 33, 1065–1076.
Pascual, L., Romo, J. and Ruiz, E. (2001). Effects of parameter estimation on prediction densities: a bootstrap approach. Int. J. Forecasting 17, 83–103.
Pitman, E.J.G. (1937). Significance tests which may be applied to samples from any populations. Supplement to the Journal of the Royal Statistical Society 4, 119–130.
Politis, D.N. and Romano, J.P. (1994a). The stationary bootstrap. J. Amer. Statist. Assoc. 89, 1303–1313.
Politis, D.N. and Romano, J.P. (1994b). Large sample confidence regions based on subsamples under minimal assumptions. Ann. Statist. 22, 2031–2050.
Politis, D.N. and Romano, J.P. (1994c). Limit theorems for weakly dependent Hilbert space valued random variables with application to the stationary bootstrap. Statist. Sin. 4, 461–476.
Politis, D.N., Romano, J.P. and Wolf, M. (1999). Subsampling. Springer-Verlag.
Prada-Sánchez, J.M. and Cotos-Yáñez, T. (1997). A simulation study of iterated and non-iterated bootstrap methods for bias reduction and confidence interval estimation. Commun. Statist. Simula. 26(3), 927–946.
Prada-Sánchez, J.M. and Otero-Cepeda, X.L. (1989). The use of smooth bootstrap techniques for estimating the error rate of a prediction rule. Commun. Statist. Simula. 18(3), 1169–1186.
Quenouille, M. (1949). Approximate tests of correlation in time-series. J. Roy. Statist. Soc. Ser. B 11, 68–84.
Radulović, D. (1996). The bootstrap for the mean of strong mixing sequences under minimal conditions. Statist. Prob. Lett. 28, 65-72.
Reid, N. (1981). Estimating the median survival time. Biometrika 68, 601–608.
Rizzo, M.L. (2008). Statistical Computing with R. Chapman & Hall/CRC.
Rosenblatt, M. (1956). Remarks on some nonparametric estimates of a density function. Ann. Math. Statist. 27, 832–837.
Rubin, D.B. (1981). The Bayesian bootstrap. Ann. Statist. 9(1), 130–134.
Rubinstein, R.Y. (1981). Simulation and the Monte Carlo Method. Wiley.
Schucany, W.R., Gray, H.L. and Owen, D.B. (1971). On bias reduction in estimation. J. Amer. Statist. Assoc. 66, 524–533.
Shao, J. (1999). Mathematical Statistics. Springer.
Shao, J. (2006). Mathematical Statistics: exercises and solutions. Springer.
Shao, J. and Tu, D. (1995). The Jackknife and Bootstrap. Springer-Verlag.
Sheather, S.J. and Jones, M.C. (1991). A reliable data-based bandwidth selection method for kernel density estimation. J. Roy. Statist. Soc. Ser. B 53, 683–690.
Silverman, B.W. (1986). Density Estimation for Statistics and Data Analysis. Chapman and Hall.
Singh, K. (1981). On the asymptotic accuracy of Efron’s bootstrap. Ann. Statist. 9(6), 1187–1195.
Stine, R.A. (1987). Estimating properties of autoregressive forecasts. J. Amer. Statist. Assoc. 82, 1072–1078.
Taylor, C. C. (1989). Bootstrap choice of the smoothing parameter in kernel density estimation. Biometrika 76, 705–712.
Thombs, L.A. and Schucany, W.R. (1990). Bootstrap prediction intervals for autoregression. J. Amer. Statist. Assoc. 85, 486–492.
Tukey, J. (1958). Bias and confidence in not quite large samples, abstract, Ann. Math. Statist. 29, 614.
Wand M.P. and Jones M.C. (1995) Kernel Smoothing. Chapman and Hall, London.
Watson, G.S. (1964). Smooth regression analysis. Sankhyā: The Indian Journal of Statistics Ser. A 26, 359-372.
Welch, B.L. (1937). On the z‐test in randomized blocks and Latin squares. Biometrika, 29, 21–52.
Wu, C.-F. J. (1986). Jackknife, bootstrap and other resampling methods in regression analysis. Ann. Statist. 14, 1261–1350.