The authors previously developed the so-called Local Discriminant Basis (LDB)
method for signal and image classification problems.
The original LDB method relies on differences in the time-frequency energy
distribution of each class: it selects the subspaces where these energy
distributions are well separated by some measure such as the Kullback-Leibler
divergence.
Through our experience and experiments on various datasets,
however, we realized that the time-frequency energy distribution is not always
the best quantity to analyze for classification.
In this paper, we propose, instead, to discriminate coordinates based on empirical probability densities.
That is, we estimate the probability density of each class in each coordinate
in the wavelet packet/local trigonometric bases after expanding signals into
such bases. We then evaluate the discriminant power of each subspace by
selecting its K most discriminant coordinates, measured by a
"distance" among the corresponding class densities (e.g., the Kullback-Leibler
divergence among the densities). This information is then used to
select a basis for classification.
We will demonstrate the capability of this algorithm using both synthetic
and real datasets.
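The coordinate-selection step described above can be sketched as follows. This is a minimal illustration, not the authors' exact estimator: it assumes simple histogram density estimates and the symmetric Kullback-Leibler divergence, and the columns of `X` stand in for coordinates in an expanded basis (e.g., wavelet packet coefficients). The function names are hypothetical.

```python
import numpy as np

def coordinate_discriminant_scores(X, y, n_bins=16, eps=1e-10):
    """Score each coordinate by summed pairwise symmetric KL divergence
    between per-class empirical densities (simple histogram estimates)."""
    classes = np.unique(y)
    scores = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        lo, hi = X[:, j].min(), X[:, j].max()
        # Empirical density of coordinate j for each class on a common grid
        dens = []
        for c in classes:
            h, _ = np.histogram(X[y == c, j], bins=n_bins, range=(lo, hi))
            p = h / h.sum() + eps          # regularize empty bins
            dens.append(p / p.sum())
        # Accumulate pairwise symmetric KL divergences among the class densities
        for a in range(len(dens)):
            for b in range(a + 1, len(dens)):
                p, q = dens[a], dens[b]
                scores[j] += np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))
    return scores

def top_k_coordinates(X, y, k):
    """Indices of the k coordinates whose class densities are most separated."""
    return np.argsort(coordinate_discriminant_scores(X, y))[::-1][:k]
```

Note that this criterion can separate classes whose densities differ in shape even when their energies (second moments) nearly coincide, e.g., a bimodal versus a unimodal coordinate distribution with matched variance, which is exactly the situation the energy-based LDB criterion can miss.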
Keywords: Local feature extraction, pattern classification, density estimation, Kullback-Leibler divergence, Hellinger distance.
Official version: doi:10.1016/S0031-3203(02)00019-5.