Perceptron - History

See also: History of artificial intelligence, AI winter and Frank Rosenblatt

Although the perceptron initially seemed promising, it was eventually proved that perceptrons could not be trained to recognise many classes of patterns. This caused the field of neural network research to stagnate for many years, before it was recognised that a feedforward neural network with two or more layers (also called a multilayer perceptron) has far greater processing power than a perceptron with one layer (also called a single-layer perceptron). Single-layer perceptrons are only capable of learning linearly separable patterns; in 1969 the famous book Perceptrons by Marvin Minsky and Seymour Papert showed that it was impossible for this class of network to learn an XOR function. It is often believed that Minsky and Papert also conjectured (incorrectly) that a similar result would hold for multilayer perceptron networks, but this is not so: both already knew that multilayer perceptrons are capable of computing an XOR function. (See the page on Perceptrons for more information.)

Three years later Stephen Grossberg published a series of papers introducing networks capable of modelling differential, contrast-enhancing and XOR functions (the papers were published in 1972 and 1973; see, e.g., Grossberg, "Contour enhancement, short-term memory, and constancies in reverberating neural networks", Studies in Applied Mathematics, 52 (1973), 213-257). Nevertheless, the often-miscited Minsky/Papert text caused a significant decline in interest in, and funding of, neural network research; it took ten more years before the field experienced a resurgence in the 1980s. The text was reprinted in 1987 as Perceptrons: Expanded Edition, in which some errors in the original are identified and corrected.
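The linear-separability limitation is easy to demonstrate concretely. The following is a minimal sketch in Python (an illustration, not code from any of the cited sources): the classic perceptron learning rule converges on the linearly separable AND function but keeps making mistakes on XOR, because no weights w1, w2 and bias b can satisfy all four XOR constraints at once.

    # Minimal single-layer perceptron with a step activation and the
    # classic error-driven update rule. All names here are illustrative.

    def predict(w, b, x):
        # Fires iff w . x + b > 0.
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

    def train(samples, epochs=100, lr=1.0):
        w, b = [0.0, 0.0], 0.0
        for _ in range(epochs):
            errors = 0
            for x, target in samples:
                error = target - predict(w, b, x)
                if error:
                    errors += 1
                    w = [wi + lr * error * xi for wi, xi in zip(w, x)]
                    b += lr * error
        return w, b, errors  # errors counted over the final epoch

    AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # separable
    XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # not separable

    for name, data in (("AND", AND), ("XOR", XOR)):
        w, b, errors = train(data)
        preds = [predict(w, b, x) for x, _ in data]
        print(name, preds, "final-epoch errors:", errors)

    # AND reaches zero errors; XOR never can: the cases (0,0) and (1,1)
    # force w1 + w2 + 2b <= 0, while (0,1) and (1,0) force w1 + w2 + 2b > 0.

AND settles on a separating line within a few epochs, while the XOR weights cycle indefinitely; this is the concrete content of the limitation described above.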

More recently, interest in the perceptron learning algorithm increased again after Freund and Schapire (1998) presented a voted formulation of the original algorithm (attaining a large margin) to which the kernel trick can be applied. Subsequent studies have shown its applicability to tasks more complex than binary classification, later called structured learning (Collins, 2002), and to large-scale machine learning problems in a distributed computing setting (McDonald, Hall and Mann, 2010).
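To give a flavour of how the kernel trick enters, here is a minimal sketch of a perceptron in dual form (an illustrative reconstruction under assumed names, not the exact voted algorithm of Freund and Schapire): the weight vector is kept implicitly as a sum over misclassified training examples, so training and prediction need only kernel evaluations, and a nonlinear kernel such as the RBF kernel used here makes even XOR learnable.

    import math

    def rbf(x, z, gamma=1.0):
        # Gaussian (RBF) kernel; gamma is an assumed illustrative value.
        return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

    def kernel_perceptron(samples, kernel=rbf, epochs=20):
        # samples: list of (x, y) with y in {-1, +1}; returns dual weights.
        alpha = [0] * len(samples)  # mistake counts per training example
        for _ in range(epochs):
            for i, (x, y) in enumerate(samples):
                s = sum(a * yj * kernel(xj, x)
                        for a, (xj, yj) in zip(alpha, samples) if a)
                if y * s <= 0:      # mistake: fold this example into the model
                    alpha[i] += 1
        return alpha

    def predict(samples, alpha, x, kernel=rbf):
        s = sum(a * y * kernel(xj, x) for a, (xj, y) in zip(alpha, samples) if a)
        return 1 if s > 0 else -1

    # XOR, unlearnable for the linear perceptron, is separable in the
    # RBF feature space.
    XOR = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), -1)]
    alpha = kernel_perceptron(XOR)
    print([predict(XOR, alpha, x) for x, _ in XOR])  # [-1, 1, 1, -1]

The voted formulation additionally weights each intermediate hypothesis by how many examples it survives without a mistake, which is what yields the large-margin behaviour; the dual representation sketched above is what makes the kernel substitution possible.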
