Matrix Chernoff Bound

Rudolf Ahlswede and Andreas Winter (Ahlswede & Winter 2003) introduced a Chernoff bound for matrix-valued random variables.

If M is distributed according to some distribution over d × d matrices with zero mean, and if M_1, …, M_t are independent copies of M, then for any ε > 0,


\Pr \left( \left\Vert \frac{1}{t} \sum_{i=1}^t M_i - \mathbf{E}[M] \right\Vert_2 > \varepsilon \right) \leq d \exp \left( -C \frac{\varepsilon^2 t}{\gamma^2} \right),

where ‖M‖₂ ≤ γ holds almost surely and C > 0 is an absolute constant.
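The concentration this bound describes can be observed numerically. The following sketch (an illustration not taken from the source; the sampling scheme and all parameter values are our own choices) draws t independent zero-mean random symmetric matrices normalized so that ‖M‖₂ ≤ 1, and shows the operator norm of their average shrinking as t grows:

```python
import numpy as np

# Illustrative sketch: zero-mean random symmetric matrices with
# operator norm at most gamma = 1 by construction, so the matrix
# Chernoff bound applies with E[M] = 0.
rng = np.random.default_rng(0)
d = 8  # matrix dimension (arbitrary illustrative choice)

def sample_matrix():
    # Symmetrized random sign matrix, rescaled to have operator norm 1.
    A = rng.choice([-1.0, 1.0], size=(d, d))
    S = (A + A.T) / 2.0
    return S / np.linalg.norm(S, ord=2)

def deviation(t):
    # Operator norm of the empirical average (1/t) * sum_i M_i.
    avg = sum(sample_matrix() for _ in range(t)) / t
    return np.linalg.norm(avg, ord=2)

for t in (10, 100, 1000):
    print(t, deviation(t))
```

By the triangle inequality the deviation never exceeds 1, and for large t it concentrates near 0 at roughly the 1/√t rate the bound predicts.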

Notice that the number of samples t in the inequality depends logarithmically on d. In general, unfortunately, such a dependency is inevitable: take, for example, a random diagonal sign matrix of dimension d × d. The operator norm of the sum of t independent samples is precisely the maximum deviation among d independent random walks of length t. In order to achieve a fixed bound on the maximum deviation with constant probability, it is easy to see that t should grow logarithmically with d in this scenario.
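The diagonal example above can be checked directly. In the sketch below (our own illustration; the walk length and dimensions are arbitrary choices), the operator norm of the averaged diagonal sign matrices is the largest absolute average over d independent ±1 random walks, whose typical size is about √(2 ln(d) / t), so it grows with d for fixed t:

```python
import numpy as np

# Illustrative sketch: for diagonal random sign matrices, the operator
# norm of (1/t) * sum_i M_i equals the maximum |average step| over
# d independent +/-1 random walks of length t.
rng = np.random.default_rng(1)
t = 2000  # walk length (arbitrary illustrative choice)

def max_walk_deviation(d):
    steps = rng.choice([-1.0, 1.0], size=(d, t))
    return np.abs(steps.mean(axis=1)).max()

for d in (10, 1000, 100000):
    # Compare the observed maximum deviation with sqrt(2 ln(d) / t).
    print(d, max_walk_deviation(d), np.sqrt(2 * np.log(d) / t))
```

Holding the target deviation ε fixed while d grows therefore forces t to grow like log d, matching the d factor in front of the exponential in the bound.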

The following theorem can be obtained by assuming M has low rank, in order to avoid the dependency on the dimensions.
