This paper proposes an approximation for the Kullback-Leibler information
based on Edgeworth expansions.
In information theory, differential entropy is a useful criterion for
identifying a multivariate normal distribution: among all distributions
with a given covariance, the multivariate normal maximizes the
differential entropy, so the neg-entropy vanishes exactly at normality.
Comon (1994) proposed an Edgeworth-based expansion of the neg-entropy in
the univariate case. Based on the Edgeworth expansion of the neg-entropy,
a diagnostic is proposed here for checking multivariate normality.
Moreover, an approximation of the Kullback-Leibler information is also
proposed. We present numerical examples to demonstrate the computational
complexity of the method and its applications to diagnosing multivariate
normality, evaluating the differential entropy, and choosing the least
statistically dependent basis from the wavelet packet dictionaries, as
sketched below.
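For reference, the leading terms of Comon's expansion in the standardized
univariate case, together with the identity linking the neg-entropy to the
Kullback-Leibler information against the matching normal, can be sketched
as follows (the paper's multivariate expansion also involves
cross-cumulants):

```latex
% Leading terms of Comon's (1994) expansion for a standardized variable x
% (zero mean, unit variance) with cumulants \kappa_3, \kappa_4:
J(x) \approx \frac{\kappa_3^2}{12} + \frac{\kappa_4^2}{48}
% For a density p and the normal density \phi matching its first two
% moments, \log\phi is quadratic, so E_p[\log\phi] = E_\phi[\log\phi] and
D(p \,\|\, \phi) = h(\phi) - h(p) = J(p),
% i.e., the neg-entropy expansion simultaneously approximates the
% Kullback-Leibler information against the matching normal.
```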
Keywords: Neg-entropy, differential entropy, cumulants,
multivariate normal diagnostic, least statistically dependent basis,
wavelet packet dictionary
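As an illustration of the normality diagnostic, here is a minimal sketch
(not the paper's implementation; the function names and the
whiten-then-sum strategy are assumptions, and whitened marginals alone
cannot detect cross-cumulant departures) that estimates the expansion
above from sample cumulants:

```python
# Minimal sketch of an Edgeworth-based normality diagnostic.
import numpy as np

def negentropy_edgeworth(x):
    """Leading terms of Comon's expansion, J ~ k3^2/12 + k4^2/48,
    estimated from the sample cumulants of a 1-D sample."""
    x = np.asarray(x, dtype=float)
    x = (x - x.mean()) / x.std()   # standardize: zero mean, unit variance
    k3 = np.mean(x**3)             # third cumulant (skewness)
    k4 = np.mean(x**4) - 3.0       # fourth cumulant (excess kurtosis)
    return k3**2 / 12.0 + k4**2 / 48.0

def normality_score(X):
    """Sum of per-coordinate neg-entropies after whitening an (n, d)
    sample; near zero for normal data, larger values flag departure."""
    X = X - X.mean(axis=0)
    L = np.linalg.cholesky(np.linalg.inv(np.cov(X, rowvar=False)))
    Z = X @ L                      # whitened coordinates (identity cov)
    return sum(negentropy_edgeworth(Z[:, j]) for j in range(Z.shape[1]))

rng = np.random.default_rng(0)
print(normality_score(rng.normal(size=(20_000, 3))))       # ~ 0
print(normality_score(rng.exponential(size=(20_000, 3))))  # clearly > 0
```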
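Finally, a rough sketch of the least statistically dependent basis search
over a one-dimensional wavelet packet dictionary, assuming the PyWavelets
package and estimating each coordinate's marginal differential entropy as
h(x) ~ (1/2) log(2*pi*e*var) - J(x); all names and the toy ensemble are
illustrative, not the paper's procedure:

```python
# Illustrative LSDB-style best-basis search (assumes PyWavelets:
# `pip install pywavelets`). The cost of a subband is the sum of the
# estimated marginal differential entropies of its coefficients across
# the ensemble; the basis minimizing the total cost is kept.
from itertools import product
import numpy as np
import pywt

def negentropy_edgeworth(x):
    x = (x - x.mean()) / (x.std() + 1e-12)
    k3, k4 = np.mean(x**3), np.mean(x**4) - 3.0
    return k3**2 / 12.0 + k4**2 / 48.0

def entropy_edgeworth(x):
    # Differential entropy via h(x) ~ 0.5*log(2*pi*e*var) - J(x).
    return 0.5 * np.log(2 * np.pi * np.e * (x.var() + 1e-12)) \
        - negentropy_edgeworth(x)

def lsdb(signals, wavelet="db2", maxlevel=3):
    X = np.asarray(signals, dtype=float)
    wps = [pywt.WaveletPacket(s, wavelet, maxlevel=maxlevel) for s in X]

    def cost(path):  # summed marginal entropies of one subband
        C = X if path == "" else np.array([wp[path].data for wp in wps])
        return sum(entropy_edgeworth(C[:, j]) for j in range(C.shape[1]))

    best, basis = {}, {}
    for level in range(maxlevel, -1, -1):      # bottom-up pruning
        for path in ("".join(t) for t in product("ad", repeat=level)):
            c = cost(path)
            if level == maxlevel:
                best[path], basis[path] = c, [path]
                continue
            split = best[path + "a"] + best[path + "d"]
            if c <= split:                     # keep parent node
                best[path], basis[path] = c, [path]
            else:                              # keep children
                best[path] = split
                basis[path] = basis[path + "a"] + basis[path + "d"]
    return basis[""]                           # paths of the chosen basis

rng = np.random.default_rng(0)
ensemble = rng.standard_normal((64, 128)).cumsum(axis=1)  # toy random walks
print(lsdb(ensemble))
```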