
Discrimination and Classification

Discrimination techniques are methods that use measured variables to assign observations to different classes. The important difference between multiple regression models and discrimination models is that the dependent variable is not a continuous response but some kind of class number. In many cases the objects can belong to only two possible classes, which reduces the discrimination problem to a binary classification; a typical example is a good/bad decision in quality inspection.
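
As a rough illustration of such a binary classifier (not part of the original text), the following Python sketch fits a logistic regression model to a handful of made-up quality-inspection measurements; the variable values, the class labels, and the use of scikit-learn are assumptions made purely for this example.

    # Minimal sketch of a binary (good/bad) classification, assuming scikit-learn.
    # The two measured variables and the class labels are invented example data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # each row is one inspected object: [measurement 1, measurement 2]
    X = np.array([[4.9, 1.1], [5.1, 1.0], [5.0, 1.2],   # objects judged "good"
                  [6.2, 1.8], [6.0, 1.9], [6.3, 1.7]])  # objects judged "bad"
    y = np.array([0, 0, 0, 1, 1, 1])                     # class number: 0 = good, 1 = bad

    model = LogisticRegression().fit(X, y)

    # classify a new, unseen object
    print(model.predict([[5.9, 1.6]]))        # predicted class number
    print(model.predict_proba([[5.9, 1.6]]))  # estimated class membership probabilities

Note that the output of the classifier is a class number (0 or 1), not a continuous quantity as in multiple regression.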

Here is a short list of the more important classification techniques:

    linear discriminant analysis (LDA)
    k-nearest neighbors classification
    logistic regression
    neural network classifiers

The classification problem can be interpreted in a geometrical way: find a suitable line (or curve) of separation between two or more groups of similar objects. The kind of separating boundary that can be drawn depends on the method used. A neural network, for instance, can create a non-linear discriminating function, whereas LDA creates only linear separating surfaces. This of course implies that not every method is equally well suited to a particular problem. A classification problem in which the classes cannot be separated by a straight line, as in the sketch below, may be solved easily by a neural network while, for example, LDA would fail.
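
To make this concrete, here is a small Python sketch (not from the original text) that builds an XOR-like data set whose classes interlock and cannot be separated by a straight line. For brevity a k-nearest neighbor classifier stands in for the neural network as the non-linear method; the cluster positions, sample sizes, and the use of scikit-learn are all assumptions made for this illustration.

    # Sketch of a problem that cannot be separated by a straight line, assuming
    # scikit-learn; the XOR-like point clouds below are invented for illustration.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    # four small clusters; diagonally opposite clusters belong to the same class
    centers = np.array([[0, 0], [1, 1], [0, 1], [1, 0]])
    X = np.vstack([c + 0.1 * rng.standard_normal((50, 2)) for c in centers])
    y = np.repeat([0, 0, 1, 1], 50)  # class numbers of the four clusters

    lda = LinearDiscriminantAnalysis().fit(X, y)
    knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)

    # LDA is restricted to a single straight separating line and stays close to
    # 50 % accuracy here, whereas the non-linear k-NN classifier separates the
    # two interlocking classes almost perfectly.
    print("LDA accuracy: ", lda.score(X, y))
    print("k-NN accuracy:", knn.score(X, y))

A neural network with a hidden layer would behave much like the k-NN classifier in this situation, since it too can form a curved decision boundary.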
Last Update: 2006-Jan-17