Numerical Cruncher



Classification


Parametric Models



Bayes decision theory is a fundamental statistical approach to the problem of pattern classification. It is based on the assumption that the decision problem is posed in probabilistic terms, and that all the probability distributions involved are known. Remember that nonparametric techniques trade the need for knowledge of the probability distributions for the need for a large number of samples.
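Stated compactly, and in the standard notation (the symbols below are assumed here, since this text does not define any): for classes w_1, ..., w_c with prior probabilities P(w_i) and class-conditional densities p(x|w_i), the Bayes rule assigns a feature vector x to the class with the largest posterior probability,

\[
P(\omega_i \mid \mathbf{x}) \;=\; \frac{p(\mathbf{x}\mid\omega_i)\,P(\omega_i)}{\sum_{j} p(\mathbf{x}\mid\omega_j)\,P(\omega_j)},
\qquad
\text{decide } \omega_k, \quad k = \arg\max_i P(\omega_i \mid \mathbf{x}),
\]

which is the decision that minimizes the probability of error.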

EXAMPLE: The parallelepipeds method is a simple parametric method. Patterns are classified using only the mean value 'u' and the standard deviation 's' of each class, computed feature by feature. A pattern is considered to belong to a class if it falls inside the parallelepiped (box) determined by 'u ± K·s'. The pattern is then assigned to the class whose parallelepiped contains it for the smallest value of 'K'.
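A minimal sketch of this rule, assuming per-class mean and standard deviation vectors estimated from training data (the function names and NumPy usage are illustrative, not part of Numerical Cruncher):

    import numpy as np

    def fit_boxes(X, y):
        # Per-class mean 'u' and standard deviation 's' for every feature.
        return {c: (X[y == c].mean(axis=0), X[y == c].std(axis=0))
                for c in np.unique(y)}

    def classify_box(x, boxes):
        # The smallest K for which x lies inside u +/- K*s on every feature
        # is max_i |x_i - u_i| / s_i; assign x to the class minimizing that K.
        best_class, best_K = None, np.inf
        for c, (u, s) in boxes.items():
            K = np.max(np.abs(x - u) / s)
            if K < best_K:
                best_class, best_K = c, K
        return best_class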

The structure of a Bayes classifier is determined primarily by the probability distributions selected. None has received more attention than the multivariate normal density, largely because of its analytical tractability. However, it is also an appropriate model for classes in a wide range of real-world situations, where the feature vectors of a given class are continuous-valued, mildly corrupted versions of a single prototype vector. When the multivariate normal density function is used, you obtain:
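In the usual notation (assumed here: d is the dimension of the feature space, and class w_i has mean vector u_i, covariance matrix S_i and prior probability P(w_i)), the class-conditional density is

\[
p(\mathbf{x}\mid\omega_i) \;=\; \frac{1}{(2\pi)^{d/2}\,|\Sigma_i|^{1/2}}
\exp\!\left[-\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu}_i)^{t}\,\Sigma_i^{-1}(\mathbf{x}-\boldsymbol{\mu}_i)\right],
\]

and the minimum-error-rate discriminant functions g_i(x) = ln p(x|w_i) + ln P(w_i) take the form

\[
g_i(\mathbf{x}) \;=\; -\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu}_i)^{t}\,\Sigma_i^{-1}(\mathbf{x}-\boldsymbol{\mu}_i)
\;-\;\tfrac{d}{2}\ln 2\pi \;-\;\tfrac{1}{2}\ln|\Sigma_i| \;+\;\ln P(\omega_i).
\]

A minimal sketch of the resulting classifier, assuming parameters have already been estimated for each class (again, the function names and NumPy usage are illustrative):

    import numpy as np

    def gaussian_discriminant(x, mu, Sigma, prior):
        # g_i(x) = ln p(x|w_i) + ln P(w_i) for a multivariate normal class model.
        d = len(mu)
        diff = x - mu
        _, logdet = np.linalg.slogdet(Sigma)
        maha = diff @ np.linalg.solve(Sigma, diff)   # (x-mu)^t Sigma^-1 (x-mu)
        return -0.5 * maha - 0.5 * d * np.log(2 * np.pi) - 0.5 * logdet + np.log(prior)

    def classify_gaussian(x, params):
        # params: list of (mu, Sigma, prior) per class; pick the largest g_i(x).
        scores = [gaussian_discriminant(x, mu, S, p) for mu, S, p in params]
        return int(np.argmax(scores))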

The second chapter of Duda and Hart's book ["Pattern Classification and Scene Analysis", John Wiley & Sons, 1973] is devoted to Bayes decision theory, with special emphasis on the solution for the multivariate normal case.