9 crazy things that could happen after the singularity, when robots become smarter than humans
Futurists say that our destiny will be shaped by the Singularity,
the moment when artificial intelligence surpasses human
intelligence.
Scholars don't agree on the details, but they
say it will happen between 30 and 1000 years from today, with
most predicting it will emerge in the next century.
It will almost certainly have profoundly scary - and deeply
exciting - consequences.
The term 'Singularity' was first used in the technological sense
(as opposed to its definition within physics) in connection with
Hungarian-American mathematician John von Neumann.
In a 1958 tribute, his colleague Stanislaw Ulam recalled von
Neumann speaking of the "ever accelerating progress of technology
and changes in the mode of human life, which gives the appearance
of approaching some essential singularity in the history of the
race beyond which human affairs, as we know them, could not
continue."
An 'intelligence explosion' will allow machines to make better machines.
In
a 1965 essay, mathematician I. J. Good predicted that
machines will eventually be able to create better machines.
His full quote:
Let an ultraintelligent machine be defined as a machine that can
far surpass all the
intellectual activities of any man however clever.
Since the design of machines is one of these intellectual
activities, an ultraintelligent machine could design even better
machines; there would then unquestionably be an "intelligence
explosion", and the intelligence of man would be left far behind.
Thus the first ultraintelligent machine is the last invention
that man need ever make.
If AI becomes better at designing AI than humans, we'll hit an
intelligence explosion that "ultimately results in machines
whose intelligence exceeds ours by more than ours exceeds that of
snails," Hawking said in a recent Reddit AMA.
The machines are (maybe) going to take over.
Alan Turing, the visionary British mathematician played by
Benedict Cumberbatch in "The Imitation Game,"
took a grim view of the singularity.
Not only would machines out-think us, he said, but they'd have no
use for us.
"Once the machine thinking method had started, it would not take
long to outstrip our feeble powers,"
he wrote in a 1951 paper.
"There would be no question of the machines dying, and they would
be able to converse with each other to sharpen their wits," he
wrote. "At some stage therefore we should have to expect the
machines to take control."
Some scholars think that the Singularity will be the moment when humans find their successors.
Turing's ideas about artificial intelligence as a kind of
evolution were furthered by Carnegie Mellon University roboticist
Hans Moravec, who says that artificially intelligent machines are
going to "succeed" humanity.
In
"Mind Children: The Future of Robot and Human Intelligence,"
Moravec predicts that robots will become an artificial
species by the 2030s or 2040s. In the same way that Homo
sapiens are distinct from but related to less intelligent apes,
these artificial life forms will rise from us - but be distinct
from us.
It's the "heirs to humanity" situation spelled out in the
awesome "Battlestar Galactica" and other sci-fi stories.
Artificial life may spring from our intelligence, but won't have
use for it.
Artificial intelligence may find that there's no real use for humanity.
Elon Musk has said in a tweet that it's "potentially more
dangerous than nukes," and recommended reading Nick Bostrom's
"Superintelligence" to understand why.
Bostrom, the founder of Oxford's Future of Humanity Institute,
says that the artificial intelligence promised by the Singularity
would end life as we know it. It's like Turing's predictions
taken to the maximum.
"There are huge existential threats, these are threats to the
very survival of life on Earth, from machine intelligence,"
he said in an interview.
In "Superintelligence," he says that post-Singularity Earth could
end up being "a society of economic miracles and technological
awesomeness, with nobody there to benefit ... A Disneyland
without children."
The economy is going to go bananas.
George Mason University economist Robin Hanson
says that there have been at least two other singularities in
human history - what we now call the Agricultural
and Industrial Revolutions.
The next revolution - the technological singularity - would be
the modern equivalent of those world-historic events,
he says, with total economic growth speeding up by 60 to 250
times the current rate.
"The world economy, which now doubles in 15 years or so, would
soon double in somewhere from a week to a month,"
he says.
We could stop aging altogether.
Biomedical gerontologist Aubrey de Grey
says we'll escape aging by finding therapies that can "repair
the molecular and cellular damage of aging" so that people can
stop becoming "biologically older."
The years might go by, but your body won't notice it.
And we'll be able to bring people back.
Google futurist Ray Kurzweil has said multiple
times that he'll be able to "bring back" his father
Fredric Kurzweil through artificial intelligence.
He believes that by the 2030s, we'll be able to send
nanobots into people's brains to extract memories of loved ones.
Kurzweil says that by combining that information with the
information in the deceased's DNA, it will be possible to create
a convincing virtual version of somebody who's passed on.
Nanobots will plug our brains directly into the Internet.
Kurzweil
is really into nanobots, tiny robots that could go straight
into our brains and modify consciousness, leading to
brain-to-brain communication and instant learning (like in
"The Matrix").
If we can plug our minds directly into the cloud - which
some say is farfetched - then we'll be able to live forever.
Virtually.