Definition

Consider n jointly distributed random variables X1, …, Xn with a joint probability density function. Let α be a non-empty subset of {1, …, n}, and define Xα = (Xi : i ∈ α). Clearly there are 2^n − 1 non-empty subsets of {1, …, n}. Corresponding to each subset α we have the joint entropy H(Xα). A vector in R^(2^n − 1) consisting of H(Xα) as its elements, for all non-empty subsets α of {1, …, n}, is called an entropic vector.
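As a concrete illustration of the definition, here is a minimal sketch that computes the entropic vector of a discrete joint distribution: for each non-empty subset α it marginalizes the joint pmf onto the variables in α and takes the Shannon entropy of the result. The function names and the dict-based representation of the pmf are illustrative choices, not from the source.

```python
from itertools import combinations
from math import log2

def entropy(pmf):
    # Shannon entropy in bits of a pmf given as {outcome: probability}
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def entropic_vector(joint, n):
    # joint: {(x1, ..., xn): probability} over n discrete variables.
    # Returns {alpha: H(X_alpha)} for every non-empty subset alpha,
    # represented as a sorted tuple of variable indices.
    vec = {}
    for k in range(1, n + 1):
        for alpha in combinations(range(n), k):
            # Marginalize the joint pmf onto the coordinates in alpha
            marginal = {}
            for outcome, p in joint.items():
                key = tuple(outcome[i] for i in alpha)
                marginal[key] = marginal.get(key, 0.0) + p
            vec[alpha] = entropy(marginal)
    return vec

# Example: two independent fair bits, so the 2^2 - 1 = 3 entries are
# H(X1) = 1, H(X2) = 1, H(X1, X2) = 2 (in bits).
joint = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}
vec = entropic_vector(joint, 2)
```

For n variables the vector has 2^n − 1 coordinates, one per non-empty subset, matching the definition above.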