Relation To Entropy
It can be shown that the Kolmogorov complexity of the output of a Markov information source is related to the entropy of that source. More precisely, the Kolmogorov complexity of the output, normalized by the output's length, converges almost surely (as that length goes to infinity) to the entropy of the source.
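Kolmogorov complexity is uncomputable, but the length of any lossless compression of a string is an upper bound on it. As an illustrative sketch (not the formal result above), the following compares the per-symbol compressed length of a long output from a simple two-state Markov source against that source's entropy; the chain parameters, function names, and choice of zlib as the compressor are all assumptions made for this example.

```python
import math
import random
import zlib

def markov_output(n, p_stay=0.9, seed=0):
    """Generate n symbols from a symmetric two-state Markov chain that
    stays in its current state with probability p_stay."""
    rng = random.Random(seed)
    state = 0
    out = []
    for _ in range(n):
        out.append(state)
        if rng.random() > p_stay:
            state = 1 - state
    return out

def entropy_rate(p_stay):
    """Entropy (bits/symbol) of the symmetric two-state chain: the
    binary entropy of its transition probability."""
    p = 1 - p_stay
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

n = 200_000
symbols = markov_output(n)
packed = bytes(symbols)                        # one byte per symbol
rate = len(zlib.compress(packed, 9)) * 8 / n   # compressed bits per symbol

print(f"compressed rate ~ {rate:.3f} bits/symbol")
print(f"entropy of source = {entropy_rate(0.9):.3f} bits/symbol")
```

A general-purpose compressor carries its own overhead, so the measured rate only loosely upper-bounds the entropy here; the point is that the normalized description length of the source's output tracks the source's entropy, not the raw symbol width.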