"Vernor Vinge. The Coming Technological Singularity: How to Survive in the Post-Human Era" - читать интересную книгу автора

_Can the Singularity be Avoided?_

Well, maybe it won't happen at all: Sometimes I try to imagine
the symptoms that we should expect to see if the Singularity is not to
develop. There are the widely respected arguments of Penrose [19] and
Searle [22] against the practicality of machine sapience. In August
of 1992, Thinking Machines Corporation held a workshop to investigate
the question "How We Will Build a Machine that Thinks" [27]. As you
might guess from the workshop's title, the participants were not
especially supportive of the arguments against machine intelligence.
In fact, there was general agreement that minds can exist on
nonbiological substrates and that algorithms are of central importance
to the existence of minds. However, there was much debate about the
raw hardware power that is present in organic brains. A minority felt
that the largest 1992 computers were within three orders of magnitude
of the power of the human brain. The majority of the participants
agreed with Moravec's estimate [17] that we are ten to forty years
away from hardware parity. And yet there was another minority who
pointed to [7], [21] and conjectured that the computational competence
of single neurons may be far higher than generally believed. If so,
our present computer hardware might be as much as _ten_ orders of
magnitude short of the equipment we carry around in our heads. If this
is true (or for that matter, if the Penrose or Searle critique is
valid), we might never see a Singularity. Instead, in the early '00s
we would find our hardware performance curves beginning to level off
-- this because of our inability to automate the design work needed to
support further hardware improvements. We'd end up with some _very_
powerful hardware, but without the ability to push it further.
Commercial digital signal processing might be awesome, giving an
analog appearance even to digital operations, but nothing would ever
"wake up" and there would never be the intellectual runaway which is
the essence of the Singularity. It would likely be seen as a golden
age ... and it would also be an end of progress. This is very like the
future predicted by Gunther Stent. In fact, on page 137 of [25],
Stent explicitly cites the development of transhuman intelligence as a
sufficient condition to break his projections.
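
The arithmetic behind these parity estimates is easy to check. A
minimal sketch, assuming hardware performance doubles roughly every
two years (the doubling period is an illustrative assumption, not a
figure from the workshop):

    import math

    def years_to_parity(gap_orders_of_magnitude, doubling_period_years=2.0):
        # Doublings needed to close a gap of 10**gap_orders_of_magnitude,
        # times the years each doubling takes.
        doublings = gap_orders_of_magnitude * math.log2(10)
        return doublings * doubling_period_years

    # The workshop minority's view: within three orders of magnitude.
    print(years_to_parity(3))   # ~20 years -- inside Moravec's 10-to-40-year window
    # The skeptical minority's view: ten orders of magnitude short.
    print(years_to_parity(10))  # ~66 years -- far beyond any near-term parity

Under these assumptions a three-order gap closes in about twenty
years, squarely within Moravec's estimate, while a ten-order gap would
take more than sixty -- which is why the single-neuron conjecture, if
correct, pushes any hardware-driven Singularity well out of the near
future.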

But if the technological Singularity can happen, it will. Even
if all the governments of the world were to understand the "threat"
and be in deadly fear of it, progress toward the goal would continue.
In fiction, there have been stories of laws passed forbidding the
construction of "a machine in the likeness of the human mind" [13].
In fact, the competitive advantage -- economic, military, even
artistic -- of every advance in automation is so compelling that
passing laws, or having customs, that forbid such things merely
assures that someone else will get them first.

Eric Drexler [8] has provided spectacular insights about how far
technical improvement may go. He agrees that superhuman intelligences
will be available in the near future -- and that such entities pose a