Definition
Consider $n$ jointly distributed random variables $X_1, \dots, X_n$ with a joint probability density function. Let $\alpha$ be a non-empty subset of $\{1, 2, \dots, n\}$, and define $X_\alpha = (X_i : i \in \alpha)$. Clearly there are $2^n - 1$ non-empty subsets of $\{1, 2, \dots, n\}$. Corresponding to each such $\alpha$, we have the joint entropy $H(X_\alpha)$. A vector in $\mathbb{R}^{2^n - 1}$ whose components are $H(X_\alpha)$ for all non-empty subsets $\alpha$ of $\{1, 2, \dots, n\}$ is called an entropic vector.
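As an illustration, the sketch below (not from the source) computes the entropic vector of $n$ discrete random variables whose joint distribution is given as an $n$-dimensional probability mass array; the function names and the NumPy-array representation are assumptions made for this example.

```python
# Minimal sketch: entropic vector of n discrete random variables whose
# joint pmf is an n-dimensional NumPy array (names are illustrative).
from itertools import combinations
import numpy as np

def joint_entropy(pmf, axes_to_keep):
    """Shannon entropy (in bits) of the marginal over the given axes."""
    n = pmf.ndim
    drop = tuple(i for i in range(n) if i not in axes_to_keep)
    marginal = pmf.sum(axis=drop) if drop else pmf
    p = marginal[marginal > 0]           # ignore zero-probability outcomes
    return float(-(p * np.log2(p)).sum())

def entropic_vector(pmf):
    """Return {subset: H(X_subset)} for every non-empty subset of {1..n}."""
    n = pmf.ndim
    vec = {}
    for k in range(1, n + 1):
        for subset in combinations(range(n), k):
            vec[subset] = joint_entropy(pmf, set(subset))
    return vec                            # 2**n - 1 entries

# Example: two independent fair bits -> H(X1) = H(X2) = 1, H(X1, X2) = 2
pmf = np.full((2, 2), 0.25)
print(entropic_vector(pmf))
```

For two independent fair bits the resulting vector has the expected $2^2 - 1 = 3$ components: $H(X_1) = H(X_2) = 1$ and $H(X_1, X_2) = 2$ bits.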