Example
Let X and Y be two independent binary random variables, each taking the values 0 and 1 with probability one-half. Then H(X) = H(Y) = 1 bit, and by independence H(X,Y) = H(X) + H(Y) = 2 bits.
Note that the mutual information is then given by I(X;Y) = H(X) + H(Y) − H(X,Y) = 1 + 1 − 2 = 0. This is because X and Y are independent. The entropic vector is thus v = (H(X), H(Y), H(X,Y)) = (1, 1, 2).
We note that v is in Γ*₂, the region of entropic vectors on two random variables, since there exist random variables (namely X and Y above) whose entropies match the entries of the vector.
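The computation above can be checked numerically. The sketch below (the variable names and the `entropy` helper are illustrative, not from the original text) builds the joint distribution of two independent fair bits, derives the marginals, and evaluates the entries of the entropic vector together with the mutual information:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Joint distribution of two independent fair bits: each of the four
# outcomes (0,0), (0,1), (1,0), (1,1) has probability 1/4.
joint = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

# Marginal distributions of X and Y.
pX = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
pY = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

H_X = entropy(pX)               # 1 bit
H_Y = entropy(pY)               # 1 bit
H_XY = entropy(joint.values())  # 2 bits

# Mutual information vanishes because X and Y are independent.
I_XY = H_X + H_Y - H_XY

print((H_X, H_Y, H_XY))  # the entropic vector: (1.0, 1.0, 2.0)
print(I_XY)              # 0.0
```

The same helper works for any finite joint distribution, so dependent variables (where I(X;Y) > 0) can be checked the same way.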