
Shannon Index

The Shannon index has been a popular diversity index in the ecological literature, where it is also known as Shannon's diversity index, the Shannon-Wiener index, the Shannon-Weaver index and the Shannon entropy. The measure was originally proposed by Claude Shannon to quantify the entropy (uncertainty or information content) in strings of text. The idea is that the more different letters there are, and the more equal their proportional abundances in the string of interest, the more difficult it is to correctly predict which letter will be the next one in the string. The Shannon entropy quantifies the uncertainty (entropy or degree of surprise) associated with this prediction. It is calculated as follows:

H' = -\sum_{i=1}^{R} p_i \ln p_i

where p_i is the proportion of characters belonging to the ith type of letter in the string of interest, and R is the total number of types. In ecology, p_i is often the proportion of individuals belonging to the ith species in the dataset of interest. Then the Shannon entropy quantifies the uncertainty in predicting the species identity of an individual that is taken at random from the dataset.
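The definition above can be sketched in a few lines of Python. This is an illustrative implementation, not from the source; the function name and the sample counts are hypothetical:

```python
import math

def shannon_entropy(abundances, base=math.e):
    """Shannon entropy H' = -sum(p_i * log(p_i)), summed over nonzero proportions."""
    total = sum(abundances)
    props = [n / total for n in abundances if n > 0]
    return -sum(p * math.log(p, base) for p in props)

# Hypothetical sample: 4 species with unequal abundances (counts, not proportions)
counts = [10, 5, 3, 2]
print(round(shannon_entropy(counts), 4))  # → 1.208
```

Passing raw counts and normalizing inside the function avoids a common pitfall: proportions that do not sum exactly to one due to rounding.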

The base of the logarithm used when calculating the Shannon entropy can be chosen freely. Shannon himself discussed logarithm bases 2, 10 and e, and these have since become the most popular bases in applications that use the Shannon entropy. Each log base corresponds to a different measurement unit; the units have been called binary digits (bits), decimal digits (decits) and natural digits (nats) for the bases 2, 10 and e, respectively. Comparing Shannon entropy values that were originally calculated with different log bases requires converting them to the same log base: changing from base a to base b is achieved by multiplying by log_b(a).
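The base-conversion rule can be checked numerically. A small sketch, using the entropy of a fair coin flip (1 bit, or ln 2 nats) as the test value:

```python
import math

# Entropy of a fair coin flip in two different units
h_bits = 1.0          # -2 * (0.5 * log2(0.5)) = 1 bit
h_nats = math.log(2)  # the same uncertainty in nats (base e)

# Converting from base a = 2 to base b = e: multiply by log_e(2)
assert math.isclose(h_bits * math.log(2, math.e), h_nats)
# And back from nats to bits: multiply by log_2(e)
assert math.isclose(h_nats * math.log(math.e, 2), h_bits)
print("base conversions agree")
```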

It has been shown that the Shannon index is based on the weighted geometric mean of the proportional abundances of the types, and that it equals the logarithm of true diversity as calculated with q = 1:

H' = -\sum_{i=1}^{R} p_i \ln p_i = -\sum_{i=1}^{R} \ln p_i^{p_i}

This can also be written

H' = -\left( \ln p_1^{p_1} + \ln p_2^{p_2} + \cdots + \ln p_R^{p_R} \right)

which equals

H' = \ln \left( \frac{1}{p_1^{p_1} \, p_2^{p_2} \cdots p_R^{p_R}} \right)

Since the sum of the p_i values equals unity by definition, the denominator equals the weighted geometric mean of the p_i values, with the p_i values themselves being used as the weights (exponents in the equation). The term within the parentheses hence equals true diversity ¹D, and H' equals ln(¹D).
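The identity between the Shannon entropy and the logarithm of true diversity can be verified numerically. A minimal sketch, assuming proportional abundances that sum to one (the values here are illustrative):

```python
import math

props = [0.5, 0.25, 0.15, 0.10]  # proportional abundances, summing to 1

# Shannon entropy in nats
h = -sum(p * math.log(p) for p in props)

# Weighted geometric mean of the p_i, with the p_i themselves as weights
weighted_geo_mean = math.prod(p ** p for p in props)

# True diversity of order 1 is the reciprocal of that mean, and H' = ln(true diversity)
true_diversity = 1 / weighted_geo_mean
assert math.isclose(h, math.log(true_diversity))
print(round(true_diversity, 4))
```

Exponentiating the Shannon entropy (exp(H')) thus converts it into an effective number of equally common types, which is often easier to interpret than the raw entropy.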

When all types in the dataset of interest are equally common, all p_i values equal 1/R, and the Shannon index hence takes the value ln(R). The more unequal the abundances of the types, the larger the weighted geometric mean of the p_i values, and the smaller the corresponding Shannon entropy. If practically all abundance is concentrated in one type, and the other types are very rare (even if there are many of them), the Shannon entropy approaches zero. When there is only one type in the dataset, the Shannon entropy exactly equals zero (there is no uncertainty in predicting the type of the next randomly chosen entity).
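These limiting cases can be confirmed directly. A short sketch (the function name and abundance vectors are illustrative, not from the source):

```python
import math

def shannon(props):
    """Shannon entropy in nats from a list of proportions."""
    return -sum(p * math.log(p) for p in props if p > 0)

R = 5
even = [1 / R] * R  # all types equally common: entropy is ln(R)
assert math.isclose(shannon(even), math.log(R))

# One dominant type plus four very rare ones: entropy is close to zero
skewed = [0.999996, 1e-6, 1e-6, 1e-6, 1e-6]
assert shannon(skewed) < 1e-3

# A single type: entropy is exactly zero
assert shannon([1.0]) == 0.0
print("limiting cases confirmed")
```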
