Inequalities in Information Theory - Shannon-type Inequalities

Shannon-type Inequalities

Consider a finite collection of finitely (or at most countably) supported random variables on the same probability space. For a collection of n random variables, there are 2^n − 1 non-empty subsets for which joint entropies can be defined. For example, when n = 2, we may consider the entropies H(X_1), H(X_2), and H(X_1, X_2) and express the following inequalities (which together characterize the range of the marginal and joint entropies of two random variables):

    H(X_1) ≥ 0
    H(X_2) ≥ 0
    H(X_1) ≤ H(X_1, X_2)
    H(X_2) ≤ H(X_1, X_2)
    H(X_1, X_2) ≤ H(X_1) + H(X_2)

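
As a quick numeric sanity check, the five inequalities can be evaluated for any particular joint distribution. The following Python sketch does so for one arbitrarily chosen joint pmf; the helper name entropy and the example distribution are ad hoc choices for illustration:

    import numpy as np

    def entropy(p):
        """Shannon entropy (in bits) of a pmf given as an array."""
        p = p[p > 0]                      # by convention, 0 log 0 = 0
        return float(-np.sum(p * np.log2(p)))

    # Joint pmf P[i, j] = Pr(X_1 = i, X_2 = j); any nonnegative array
    # summing to 1 would serve equally well.
    P = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

    H1  = entropy(P.sum(axis=1))          # H(X_1), marginal of X_1
    H2  = entropy(P.sum(axis=0))          # H(X_2), marginal of X_2
    H12 = entropy(P.ravel())              # H(X_1, X_2), joint entropy

    assert H1 >= 0 and H2 >= 0            # nonnegativity
    assert H1 <= H12 and H2 <= H12        # joint dominates each marginal
    assert H12 <= H1 + H2                 # subadditivity
    print(H1, H2, H12)                    # -> 1.0 1.0 ~1.722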
In fact, these can all be expressed as special cases of a single inequality involving the conditional mutual information, namely

    I(A; B | C) ≥ 0,

where A, B, and C each denote the joint distribution of some arbitrary (possibly empty) subset of our collection of random variables. Inequalities that can be derived as nonnegative linear combinations of instances of this one are known as Shannon-type inequalities. More formally (following the notation of Yeung), define Γ_n^* to be the set of all constructible points in ℝ^(2^n − 1), where a point is said to be constructible if and only if there is a joint, discrete distribution of n random variables such that each coordinate of that point, indexed by a non-empty subset of {1, 2, ..., n}, is equal to the joint entropy of the corresponding subset of the n random variables. The closure of Γ_n^* is denoted Γ̄_n^*.
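
For instance, taking A = B = X_2 and C = X_1 gives

    I(X_2; X_2 | X_1) = H(X_2 | X_1) = H(X_1, X_2) − H(X_1) ≥ 0,

which is exactly H(X_1) ≤ H(X_1, X_2) from the list above; taking A = X_1, B = X_2, and C empty gives I(X_1; X_2) = H(X_1) + H(X_2) − H(X_1, X_2) ≥ 0, the subadditivity inequality, and the remaining inequalities arise similarly (e.g. H(X_1) = I(X_1; X_1) ≥ 0).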

The cone in ℝ^(2^n − 1) characterized by all Shannon-type inequalities among n random variables is denoted Γ_n. Software has been developed to automate the task of proving such inequalities. Given an inequality, such software is able to determine whether the half-space defined by the inequality contains the cone Γ_n, in which case the inequality can be verified, since Γ̄_n^* ⊆ Γ_n.
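
Concretely, checking a linear inequality b · h ≥ 0 over Γ_n reduces to a linear program: minimize b · h subject to the elemental Shannon inequalities and test whether the minimum is 0 (the inequality holds on the whole cone) or unbounded below (it fails somewhere on the cone). The Python sketch below illustrates this idea for n = 2; the formulation and names are illustrative only and do not reproduce the interface of an actual prover such as Yeung and Yan's ITIP:

    import numpy as np
    from scipy.optimize import linprog

    # Coordinates of an entropy vector h = (H(X_1), H(X_2), H(X_1, X_2)).
    # Elemental Shannon inequalities for n = 2, written as A @ h >= 0:
    #   H(X_1 | X_2) >= 0  ->  -h2 + h12 >= 0
    #   H(X_2 | X_1) >= 0  ->  -h1 + h12 >= 0
    #   I(X_1; X_2)  >= 0  ->   h1 + h2 - h12 >= 0
    A = np.array([[ 0.0, -1.0,  1.0],
                  [-1.0,  0.0,  1.0],
                  [ 1.0,  1.0, -1.0]])

    def is_shannon_type(b):
        """True if b @ h >= 0 for every h in the cone Gamma_2.

        Minimizes b @ h subject to A @ h >= 0.  Over a cone the optimum
        is either 0 (the inequality holds on all of Gamma_2) or
        unbounded below (some point of Gamma_2 violates it).
        """
        res = linprog(c=b, A_ub=-A, b_ub=np.zeros(len(A)),
                      bounds=[(None, None)] * 3)
        return res.status == 0            # 0 = optimal, 3 = unbounded

    # H(X_1) <= H(X_1, X_2), i.e. -h1 + h12 >= 0: provable.
    print(is_shannon_type(np.array([-1.0, 0.0, 1.0])))    # True
    # H(X_1, X_2) <= H(X_2), i.e. h2 - h12 >= 0: not provable.
    print(is_shannon_type(np.array([0.0, 1.0, -1.0])))    # False

For general n the same program is set up over the full list of elemental inequalities (the n conditional entropies of one variable given all the others, and the conditional mutual informations between pairs given arbitrary subsets of the rest), which together characterize Γ_n.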
