History of natural language processing
- History of machine translation
- History of automated essay scoring
- History of natural language user interface
- History of natural language understanding
- History of optical character recognition
- History of question answering
- History of speech synthesis
- Turing test – test of a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. In the original illustrative example, a human judge engages in a natural language conversation with a human and a machine designed to generate performance indistinguishable from that of a human being. All participants are separated from one another. If the judge cannot reliably tell the machine from the human, the machine is said to have passed the test. The test was introduced by Alan Turing in his 1950 paper "Computing Machinery and Intelligence," which opens with the words: "I propose to consider the question, 'Can machines think?'"
- Universal grammar – theory in linguistics, usually credited to Noam Chomsky, proposing that the ability to learn grammar is hard-wired into the brain. The theory suggests that linguistic ability manifests itself without being taught (see poverty of the stimulus), and that there are properties that all natural human languages share. It is a matter of observation and experimentation to determine precisely what abilities are innate and what properties are shared by all languages.
- ALPAC – a committee of seven scientists led by John R. Pierce, established in 1964 by the U.S. government to evaluate progress in computational linguistics in general and machine translation in particular. Its report, issued in 1966, gained notoriety for being highly skeptical of the machine translation research done up to that point and for emphasizing the need for basic research in computational linguistics; it eventually led the U.S. government to reduce its funding of the field dramatically.
- Conceptual dependency theory – a model of natural language understanding used in artificial intelligence systems. Roger Schank at Stanford University introduced the model in 1969, in the early days of artificial intelligence. This model was extensively used by Schank's students at Yale University such as Robert Wilensky, Wendy Lehnert, and Janet Kolodner.
- Augmented transition network – a type of graph-theoretic structure used in the operational definition of formal languages, especially in parsing relatively complex natural languages, with wide application in artificial intelligence. Introduced by William A. Woods in 1970.
- Distributed Language Translation (project) –