History of natural language processing
- History of machine translation
- History of automated essay scoring
- History of natural language user interface
- History of natural language understanding
- History of optical character recognition
- History of question answering
- History of speech synthesis
- Turing test – test of a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of an actual human. In the original illustrative example, a human judge engages in a natural language conversation with a human and a machine designed to generate performance indistinguishable from that of a human being. All participants are separated from one another. If the judge cannot reliably tell the machine from the human, the machine is said to have passed the test. The test was introduced by Alan Turing in his 1950 paper "Computing Machinery and Intelligence," which opens with the words: "I propose to consider the question, 'Can machines think?'"
- Universal grammar – theory in linguistics, usually credited to Noam Chomsky, proposing that the ability to learn grammar is hard-wired into the brain. The theory suggests that linguistic ability manifests itself without being taught (see poverty of the stimulus), and that there are properties that all natural human languages share. It is a matter of observation and experimentation to determine precisely what abilities are innate and what properties are shared by all languages.
- ALPAC – a committee of seven scientists led by John R. Pierce, established in 1964 by the U.S. government to evaluate progress in computational linguistics in general and machine translation in particular. Its 1966 report became notorious for its deep skepticism toward the machine translation research done up to that point and its emphasis on the need for basic research in computational linguistics; it eventually led the U.S. government to reduce its funding for the field dramatically.
- Conceptual dependency theory – a model of natural language understanding used in artificial intelligence systems. Roger Schank at Stanford University introduced the model in 1969, in the early days of artificial intelligence. This model was extensively used by Schank's students at Yale University such as Robert Wilensky, Wendy Lehnert, and Janet Kolodner.
- Augmented transition network – type of graph-theoretic structure used in the operational definition of formal languages, used especially in parsing relatively complex natural languages, and having wide application in artificial intelligence. Introduced by William A. Woods in 1970 (a minimal illustrative sketch follows this list).
- Distributed Language Translation (project) –
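The augmented transition network entry above describes a recursive graph of states and arcs used for parsing. Below is a minimal sketch of that idea, assuming a toy grammar, lexicon, and function names invented purely for illustration; real ATNs also attach registers and arbitrary tests and actions to arcs (the "augmented" part), which this sketch omits.

```python
# Minimal illustrative sketch of a transition-network parser in the spirit of
# an ATN. The grammar, lexicon, and function names are invented for this
# example and are not Woods' original notation.

LEXICON = {
    "the": "DET", "a": "DET",
    "dog": "NOUN", "cat": "NOUN",
    "saw": "VERB", "chased": "VERB",
}

# Each network maps a state name to a list of arcs: (kind, label, next_state).
# A "cat" arc consumes one word of the given lexical category, a "push" arc
# recursively traverses another network, and a "pop" arc ends the network.
NETWORKS = {
    "S": {
        "S0": [("push", "NP", "S1")],
        "S1": [("cat", "VERB", "S2")],
        "S2": [("push", "NP", "S3")],
        "S3": [("pop", None, None)],
    },
    "NP": {
        "NP0": [("cat", "DET", "NP1")],
        "NP1": [("cat", "NOUN", "NP2")],
        "NP2": [("pop", None, None)],
    },
}

def parse(net, words, pos=0):
    """Try to traverse network `net` starting at word position `pos`.
    Returns the position reached after a successful traversal, or None."""
    arcs_for = NETWORKS[net]

    def walk(state, pos):
        for kind, label, nxt in arcs_for[state]:
            if kind == "pop":
                return pos
            if kind == "cat":
                # Consume one word if its category matches the arc label.
                if pos < len(words) and LEXICON.get(words[pos]) == label:
                    result = walk(nxt, pos + 1)
                    if result is not None:
                        return result
            elif kind == "push":
                # Recursively run a sub-network (e.g. NP from within S).
                sub_end = parse(label, words, pos)
                if sub_end is not None:
                    result = walk(nxt, sub_end)
                    if result is not None:
                        return result
        return None  # no arc from this state succeeded; backtrack

    return walk(net + "0", pos)

sentence = "the dog chased a cat".split()
print(parse("S", sentence) == len(sentence))  # True: the whole sentence parses
```

The feature this sketch highlights is the "push" arc, which recursively invokes a sub-network; adding registers and conditions to such arcs is what distinguishes a full ATN from a plain recursive transition network.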