Entropic Vector - Definition

Definition

Consider n jointly distributed random variables X_1, …, X_n with a joint probability density function. Let S be a subset of {1, 2, …, n}. Now we define X_S = (X_i : i ∈ S), where S is non-empty. Clearly there are 2^n − 1 non-empty subsets of {1, 2, …, n}. Corresponding to each such subset S, we have the joint entropy H(X_S). A vector in R^(2^n − 1) consisting of H(X_S) as its elements, for all non-empty subsets S of {1, 2, …, n}, is called an entropic vector.
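The construction above can be sketched numerically. The following Python snippet, assuming a small toy joint distribution (two fair, independent bits — an illustrative choice, not from the source), enumerates all 2^n − 1 non-empty subsets and computes the joint entropy of each, yielding the entropic vector:

```python
from itertools import chain, combinations
from math import log2

# Toy joint pmf of n = 2 binary random variables (an assumed example).
pmf = {
    (0, 0): 0.25,
    (0, 1): 0.25,
    (1, 0): 0.25,
    (1, 1): 0.25,
}
n = 2

def marginal(pmf, subset):
    """Marginal pmf of the variables indexed by `subset` (0-based)."""
    out = {}
    for outcome, p in pmf.items():
        key = tuple(outcome[i] for i in subset)
        out[key] = out.get(key, 0.0) + p
    return out

def entropy(dist):
    """Shannon entropy in bits: H = -sum p log2 p over positive masses."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# All 2^n - 1 non-empty subsets of {0, ..., n-1}, in a fixed order.
subsets = list(chain.from_iterable(
    combinations(range(n), k) for k in range(1, n + 1)))

# The entropic vector: one joint-entropy component per non-empty subset.
entropic_vector = [entropy(marginal(pmf, s)) for s in subsets]
print(entropic_vector)  # [H(X1), H(X2), H(X1, X2)] = [1.0, 1.0, 2.0]
```

For this uniform distribution each bit carries one bit of entropy and the pair carries two, so the entropic vector is (1, 1, 2); any other joint pmf over the same outcomes can be substituted to obtain its entropic vector the same way.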
