Entropic Vector - Example


Let X and Y be two independent binary random variables, each taking its two values with probability one-half. Then


H(X) = H(Y) = 1, \quad H(X,Y) = 2

The mutual information is then


I(X;Y) = H(X) + H(Y) - H(X,Y) = 1 + 1 - 2 = 0

as expected, since X and Y are independent. The entropic vector is thus


v = \left( H(X), H(Y), H(X,Y) \right)^T = \left( 1, 1, 2 \right)^T

This vector is entropic, i.e. it lies in the region of entropic vectors (commonly denoted $\Gamma_2^*$), because there exist random variables whose entropies equal the entries of the vector.
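The entropies above can be checked numerically. The following is a minimal sketch (function and variable names are illustrative) that builds the joint distribution of two independent fair bits, derives the marginals, and computes the entropic vector and the mutual information:

```python
from itertools import product
from math import log2

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Joint distribution of two independent fair bits: p(x, y) = 1/4 for all four outcomes.
joint = {(x, y): 0.25 for x, y in product([0, 1], repeat=2)}

# Marginal distributions obtained by summing out the other variable.
p_x, p_y = {}, {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0) + p
    p_y[y] = p_y.get(y, 0) + p

h_x = entropy(p_x)        # H(X)   = 1
h_y = entropy(p_y)        # H(Y)   = 1
h_xy = entropy(joint)     # H(X,Y) = 2
mi = h_x + h_y - h_xy     # I(X;Y) = 0

v = (h_x, h_y, h_xy)      # entropic vector (1, 1, 2)
print("v =", v, " I(X;Y) =", mi)
```

Running this prints the vector (1.0, 1.0, 2.0) and a mutual information of 0.0, matching the calculation above.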

