V. V. Fedorov, Theory of Optimal Experiments, 1972.

J. Gauchi, Plans d'expériences optimaux pour modèles linéaires [Optimal experimental designs for linear models], Plans d'expériences : Applications à l'entreprise.

J. Kiefer, Optimum experimental designs, J. Roy. Statist. Soc. Ser. B, vol.21, pp.272-319, 1959.

J. Kiefer and J. Wolfowitz, Optimum designs in regression problems, Ann. Math. Statist, vol.30, pp.271-294, 1959.

H. P. Wynn, The sequential generation of D-optimum experimental designs, Ann. Math. Statist, vol.41, pp.1655-1664, 1970.

J. Vila, Local optimality of replications from a minimal D-optimal design in regression: A sufficient and a quasi-necessary condition, J. Statist. Planning Inference, vol.29, pp.261-277, 1991.

D. MacKay, Information-based objective functions for active data selection, Neural Comput, vol.4, pp.590-604, 1992.

D. Cohn, Neural network exploration using optimal experiment design, Advances in Neural Information Processing Systems, vol.6, Cambridge, MA, 1994.

S. Issanchou and J. Gauchi, Plans d'expériences optimaux pour réseaux de neurones [Optimal experimental designs for neural networks], 2004.

M. Witczak, Toward the training of feed-forward neural networks with the D-optimum input sequence, IEEE Trans. Neural Netw, vol.17, issue.2, pp.357-373, 2006.

C. R. Rao and H. Toutenburg, Linear Models: Least Squares and Alternatives, ser. Springer Series in Statistics.

T. J. Mitchell, An algorithm for the construction of D-optimal experimental designs, Technometrics, vol.16, pp.203-210, 1974.

D. M. Bates and D. G. Watts, Nonlinear Regression Analysis and Its Applications, 1988.

G. Monari and G. Dreyfus, Local overfitting control via leverages, Neural Comput, vol.14, pp.1481-1506, 2002.

R. L. Iman, J. C. Helton, and J. E. Campbell, An approach to sensitivity analysis of computer models, Part I. Introduction, input variable selection and preliminary variable assessment, J. Quality Technol, vol.13, pp.174-183, 1981.

A. C. Atkinson and A. N. Donev, The construction of exact D-optimal designs with application to blocking response surface designs, Biometrika, vol.76, pp.515-526, 1989.

D. Cohn, L. Atlas, and R. Ladner, Improving generalization with active learning, Mach. Learn, vol.15, pp.201-221, 1994.

P. Melville and R. Mooney, Diverse ensembles for active learning, Proc. 21st Int. Conf. Mach. Learn, pp.584-591, 2004.

S. Thrun and K. Möller, Active exploration in dynamic environments, Advances in Neural Information Processing Systems, vol.4, 1992.

G. Schohn and D. Cohn, Less is more: Active learning with support vector machines, Proc. 17th Int. Conf. Mach. Learn, pp.839-846, 2000.

K. K. Sung and P. Niyogi, Active learning for function approximation, Advances in Neural Information Processing Systems, vol.7, 1995.

L. Breiman, Bagging predictors, Mach. Learn, vol.24, issue.2, pp.123-140, 1996.

B. Efron and R. Tibshirani, An Introduction to the Bootstrap, 1993.

A. Saltelli and T. Homma, Sensitivity analysis for model output: performances of black box techniques on three international benchmark exercises, Comput. Statist. Data Anal, vol.13, pp.73-94, 1992.

L. A. Feldkamp, D. V. Prokhorov, and C. F. Eagen, Multiple-start directed search for improved NN solution, Proc. Int. Joint Conf. Neural Netw, pp.991-996, 2004.

H. Niederreiter, Random Number Generation and Quasi-Monte Carlo Methods, 1992.