
Generalization and Overtraining

The problem with any modeling method that does not require assumptions about the type of model ("model-free methods") is that such models tend to adapt to any data - even noise - if they are used in the wrong way. In the specific case of neural networks, this effect is called overtraining or overfitting. Overtraining occurs when the neural network is too powerful for the problem at hand: instead of "recognizing" the underlying trend in the data, it learns the data by heart, including the noise. The result is a very good fit to the training data but poor generalization to new data.
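
The interactive example is not available in this text-only edition, but a minimal sketch can give a similar impression of the adverse effects of overtraining. Python with NumPy is assumed here, and a high-degree polynomial stands in for an over-powerful network; the code is illustrative only:

    import numpy as np

    rng = np.random.default_rng(0)

    # Ten noisy samples of a simple underlying trend, y = sin(x).
    x_train = np.linspace(0.0, np.pi, 10)
    y_train = np.sin(x_train) + rng.normal(scale=0.1, size=x_train.size)

    # An over-powerful model: a degree-9 polynomial has as many free
    # parameters as there are training points and can learn them by heart.
    overfit = np.polynomial.Polynomial.fit(x_train, y_train, deg=9)
    # A model better matched to the complexity of the trend.
    simple = np.polynomial.Polynomial.fit(x_train, y_train, deg=3)

    # Noise-free points from the same trend expose the difference in
    # generalization: the overfit model reproduces the training data
    # almost exactly but typically misses the trend between the points.
    x_test = np.linspace(0.0, np.pi, 200)
    for name, model in [("degree 9", overfit), ("degree 3", simple)]:
        err_train = np.sqrt(np.mean((model(x_train) - y_train) ** 2))
        err_test = np.sqrt(np.mean((model(x_test) - np.sin(x_test)) ** 2))
        print(f"{name}: train RMSE {err_train:.4f}, test RMSE {err_test:.4f}")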

As the example above shows, good generalization is essential for a useful model. Several methods are available to check the degree of generalization and to detect overfitting; one of the simplest is to hold back part of the data for validation and compare the error on the training data with the error on the held-back data, as sketched below.
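
Under the same assumptions as above (Python with NumPy, polynomial models standing in for a neural network), the hold-out check looks like this: the validation set takes no part in the fitting, and the degree at which its error starts to rise again marks the onset of overtraining.

    import numpy as np

    rng = np.random.default_rng(1)

    # Noisy samples of y = sin(x), split alternately into a training set
    # and a validation set that is never used for fitting.
    x = np.linspace(0.0, np.pi, 40)
    y = np.sin(x) + rng.normal(scale=0.1, size=x.size)
    train = np.arange(x.size) % 2 == 0
    valid = ~train

    # Fit models of increasing flexibility on the training set only.
    for degree in (1, 3, 5, 9, 13, 17):
        model = np.polynomial.Polynomial.fit(x[train], y[train], deg=degree)
        err_train = np.sqrt(np.mean((model(x[train]) - y[train]) ** 2))
        err_valid = np.sqrt(np.mean((model(x[valid]) - y[valid]) ** 2))
        # The training error keeps shrinking as the degree grows; once the
        # validation error starts rising, the model is fitting the noise.
        print(f"degree {degree:2d}: train RMSE {err_train:.3f}, "
              f"valid RMSE {err_valid:.3f}")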
 


Last Update: 2006-Jan-17