The Kernel Mutual Information

Technical Report, 2003


We introduce two new functions, the kernel covariance (KC) and the kernel mutual information (KMI), to measure the degree of independence of several continuous random variables. The former is guaranteed to be zero if and only if the random variables are pairwise independent; the latter shares this property, is based on a kernel density estimate, and additionally provides an approximate upper bound on the mutual information near independence. We show that Bach and Jordan's kernel generalised variance (KGV) is also an upper bound on the same kernel density estimate, but a looser one. Finally, we suggest that the addition of a regularising term in the KGV causes it to approach the KMI, which motivates the introduction of this regularisation. The performance of the KC and KMI is verified in the context of instantaneous independent component analysis (ICA), by recovering both artificial and real (musical) signals following linear mixing.

Author(s): Gretton, A. and Herbrich, R. and Smola, AJ.
Year: 2003
Month: April

Department(s): Empirical Inference
Bibtex Type: Technical Report (techreport)

Institution: Max Planck Institute for Biological Cybernetics

Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik


BibTeX

@techreport{2212,
  title = {The Kernel Mutual Information},
  author = {Gretton, A. and Herbrich, R. and Smola, AJ.},
  organization = {Max-Planck-Gesellschaft},
  institution = {Max Planck Institute for Biological Cybernetics},
  school = {Biologische Kybernetik},
  month = apr,
  year = {2003}
}