Structural Versus Algorithmic Information Theory
Since the 1960s, SIT (in psychology) and AIT (in computer science) have evolved independently as viable alternatives to Shannon's classical information theory, which had been developed in communication theory. In Shannon's approach, things are assigned codes whose lengths are based on their probabilities, understood as frequencies of occurrence (as, for example, in the Morse code). In many domains, including perception, however, such probabilities are hard to quantify, if they can be quantified at all. Both SIT and AIT circumvent this problem by turning to the descriptive complexities of individual things.
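To make the contrast concrete, the sketch below (not part of either theory) computes Shannon's ideal code length from an item's assumed probability and, as a crude computable stand-in for descriptive complexity, the compressed size of an individual item; the function names and the use of zlib are illustrative assumptions, not anything prescribed by SIT or AIT.

```python
import math
import zlib

def shannon_code_length(probability: float) -> float:
    """Ideal code length (in bits) for an item with a known probability of occurrence."""
    return -math.log2(probability)

def descriptive_complexity(item: str) -> int:
    """Approximate the complexity of one individual item by its compressed size in bytes."""
    return len(zlib.compress(item.encode("utf-8")))

# Shannon: code length depends on how often the item occurs, not on the item itself.
print(shannon_code_length(0.5))    # frequent item -> 1 bit
print(shannon_code_length(0.01))   # rare item     -> about 6.6 bits

# Descriptive complexity: depends on the internal structure of the single item.
print(descriptive_complexity("abababababababab"))  # regular string  -> shorter description
print(descriptive_complexity("qzjxkvwpmtrlgnbd"))  # irregular string -> longer description
```

The point of the comparison is only that the first measure needs occurrence frequencies, which are often unavailable in perception, whereas the second is defined for a single item in isolation.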
Although SIT and AIT share many starting points and objectives, there are also several relevant differences:
- First, SIT makes the perceptually relevant distinction between structural and metrical information, whereas AIT does not;
- Second, SIT encodes for a restricted set of perceptually relevant kinds of regularities, whereas AIT encodes for any imaginable regularity;
- Third, in SIT, the relevant outcome of an encoding is a hierarchical organization, whereas in AIT, it is only a complexity value (see the sketch after this list).
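The toy sketch below illustrates the third difference. It is a simplification, not SIT's actual coding language or minimal-encoding algorithm: it handles only one regularity (iteration) and uses an ad hoc load measure, but it shows how an encoding can yield both an organization of the input and a complexity value, whereas a purely algorithmic measure delivers only a number.

```python
def encode_iteration(s: str) -> str:
    """If s is n repetitions of a chunk, describe it as 'n * (chunk)'; otherwise keep it literal."""
    best = s  # literal description as a fallback
    for size in range(1, len(s) // 2 + 1):
        chunk = s[:size]
        if len(s) % size == 0 and chunk * (len(s) // size) == s:
            best = f"{len(s) // size} * ({chunk})"
            break  # the smallest repeating chunk gives the shortest description
    return best

def load(description: str) -> int:
    """Crude complexity measure: count the symbol tokens left in a description."""
    return sum(ch.isalnum() for ch in description)

string = "abcabcabcabc"
description = encode_iteration(string)       # organization of the string: '4 * (abc)'
print(description, load(description))        # SIT-style outcome: organization plus its load
print(load(string))                          # AIT-style outcome: only a complexity value
```

In the same spirit as the first two differences, a fuller SIT encoder would restrict itself to a fixed set of perceptually motivated regularities (iteration, symmetry, alternation) rather than to any compressible pattern whatsoever.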