Information Bottleneck Method - Gaussian Information Bottleneck


A relatively simple application of the information bottleneck is to Gaussian variates, and this has some semblance to a least-squares reduced rank or canonical correlation analysis. Assume X, Y are jointly multivariate zero-mean normal vectors with covariances \Sigma_{XX}, \Sigma_{YY}, \Sigma_{XY}, and that T is a compressed version of X which must maintain a given value of mutual information with Y. It can be shown that the optimum T is a normal vector consisting of linear combinations of the elements of X,

T = AX,

where the matrix A has orthogonal rows.

The projection matrix A in fact contains rows selected from the weighted left eigenvectors of the singular value decomposition of the following matrix (generally asymmetric):

\Omega = \Sigma_{X|Y}\,\Sigma_{XX}^{-1} = I - \Sigma_{XY}\,\Sigma_{YY}^{-1}\,\Sigma_{YX}\,\Sigma_{XX}^{-1}

Define the singular value decomposition

\Omega = U \Lambda V^{T} \quad \text{with} \quad \Lambda = \mathrm{Diag}\big(\lambda_1 \le \lambda_2 \le \cdots \le \lambda_N\big)

and the critical values

\beta_i^{C} = \left(1 - \lambda_i\right)^{-1}, \qquad \lambda_i < 1;

then the number n of active eigenvectors in the projection, or order of approximation, is given by

\beta_n^{C} < \beta \le \beta_{n+1}^{C}.

And we finally get

A = \big[\, w_1 U_1, \ldots, w_n U_n \,\big]^{T},

in which the weights are given by

w_i = \sqrt{\dfrac{\beta\,(1 - \lambda_i) - 1}{\lambda_i\, r_i}},

where

r_i = U_i^{T}\, \Sigma_{XX}\, U_i.
Applying the Gaussian information bottleneck to time series yields optimal predictive coding. This procedure is formally equivalent to linear Slow Feature Analysis. Optimal temporal structures in linear dynamic systems can be revealed in the so-called past-future information bottleneck.
