Intelligence Explosion

The notion of an "intelligence explosion" was first described by Good (1965), who speculated on the effects of superhuman machines:

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an ‘intelligence explosion,’ and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make.

Most proposed methods for creating superhuman or transhuman minds fall into one of two categories: intelligence amplification of human brains and artificial intelligence. The speculated means of intelligence augmentation are numerous, and include bioengineering, genetic engineering, nootropic drugs, AI assistants, direct brain-computer interfaces, and mind uploading. The existence of multiple paths to an intelligence explosion makes a singularity more likely; for a singularity not to occur, all of them would have to fail.
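
To make the "multiple paths" argument concrete, consider a toy calculation (the independence assumption and the failure probability below are illustrative, not claims from the source): if each of n independent paths fails with probability p, the chance that all of them fail is p to the power n.

    # Toy illustration (assumptions are hypothetical): if each of
    # n independent paths to superhuman intelligence fails with
    # probability p, the chance that every one of them fails is p**n.
    def prob_all_paths_fail(p_fail: float, n_paths: int) -> float:
        return p_fail ** n_paths

    # Even with pessimistic 80% odds of failure per path, six
    # independent paths leave only a ~26% chance that none succeeds.
    print(prob_all_paths_fail(0.8, 6))  # 0.262144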

Hanson (1998) is skeptical of human intelligence augmentation, writing that once the "low-hanging fruit" of easy methods for increasing human intelligence has been exhausted, further improvements will become increasingly difficult to find. Despite the many speculated means of amplifying human intelligence, non-human artificial intelligence (specifically seed AI) is the most popular option among organizations trying to advance the singularity.

Whether or not an intelligence explosion occurs depends on three factors. The first, an accelerating factor, is the set of new intelligence enhancements made possible by each previous improvement. Conversely, as intelligences become more advanced, further advances become more and more complicated, possibly outweighing the advantage of increased intelligence. Each improvement must, on average, beget at least one further improvement for the explosion to continue. Finally, there is the issue of a hard upper limit: the laws of physics will eventually prevent any further improvement.
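
The interplay of these three factors can be caricatured with a simple recurrence. The toy simulation below is a sketch under made-up parameters, not a model from the singularity literature: each improvement's size is multiplied by a return factor, damped by a difficulty term that grows with the current level, and capped by a hard physical ceiling.

    # Toy recurrence (all parameters are illustrative assumptions):
    # each improvement begets the next, but difficulty grows with the
    # current level and a hard physical ceiling caps progress.
    def simulate(level=1.0, gain=0.1, returns=1.2,
                 difficulty=0.02, ceiling=1000.0, steps=200):
        for _ in range(steps):
            level = min(level + gain, ceiling)
            # Diminishing returns: advances get harder at higher levels.
            gain *= returns / (1.0 + difficulty * level)
            if level >= ceiling or gain < 1e-9:
                break
        return round(level, 2)

    print(simulate(returns=1.2))  # grows quickly, then stalls as difficulty wins
    print(simulate(returns=0.9))  # fizzles: each improvement begets less than one more

Whether a run explodes or fizzles turns on exactly the condition stated above: whether each improvement, on average, begets at least one more.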

There are two logically independent, but mutually reinforcing, accelerating effects: increases in the speed of computation, and improvements to the algorithms used. The former is predicted by Moore's Law and forecast improvements in hardware, and is broadly comparable to previous technological advances. On the other hand, most AI researchers believe that software is more important than hardware.
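
A back-of-the-envelope sketch (all numbers below are assumptions for illustration, not forecasts) shows why the two effects reinforce each other: effective capability is the product of hardware speed and algorithmic efficiency, so the gains multiply rather than add.

    # Hypothetical compounding of the two effects (parameters are
    # assumptions, not forecasts): hardware doubling every 2 years
    # multiplied by 40%-per-year algorithmic progress.
    years = 10
    hardware = 2 ** (years / 2)   # ~32x from faster hardware alone
    software = 1.4 ** years       # ~29x from better algorithms alone
    print(f"hardware {hardware:.0f}x * software {software:.0f}x "
          f"= {hardware * software:.0f}x effective capability")  # ~926x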
