The Vingean Singularity [Obsolete]


This version is obsolete, but I left it here for egoistic and nostalgic reasons. See the Edited version.


There are scientists who say that technological progress will accelerate indefinitely, causing a mind-boggling change in the basic principles of what we humans affectionately call 'home'. It is a well-known fact among Singularitarians that this event, known as the Singularity1, will occur sometime before the year 2100. Some attribute the idea to Vernor Vinge and claim that it was first mentioned in his sci-fi book True Names and Other Dangers, one of the author's answers to a very interesting question: What would happen if the human race created an artificial intelligence more intelligent than its creators? A character in the book finds himself 'precipitated over an abyss' when trying to predict future technology by extrapolating current trends. Vinge feels that when humans create an intelligence greater than their own, 'human history will have reached a kind of singularity'.


Although the Singularity is sometimes referred to as the brainchild of Vinge, it was briefly mentioned in the 1950s by John von Neumann, as Raymond Kurzweil points out in his excellent book précis The Singularity is Near. Von Neumann observed that 'the ever accelerating progress of technology [...] gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue'.

A Brief Explanation


So what do Vinge and von Neumann mean when they use the term 'singularity'? It is essentially a mathematical analogy that can be interpreted either literally or figuratively. This entry presents both interpretations, and since the literal is more or less the basis of the figurative, the literal is presented first.


Imagine a curve that represents the technological progress of human beings throughout history. Most people would agree that we have come a long way in only a century; we've invented new technology, learned new things and developed as a race. Thus the curve slopes upward.


We can examine this curve and extrapolate to create a function that describes the level of progress at any given point in time. Many have done this, including Ray Kurzweil, the renowned inventor of synthesizers and text-to-speech machines and author of The Age of Intelligent Machines, a book that won the Association of American Publishers' Award for the Most Outstanding Computer Science Book of 1990. Kurzweil has shown that the doubling period of the speed of computers is diminishing, ie at the beginning of the 20th Century it took us three years to double the speed and memory capacity of computers, whereas now the same kind of progress is achieved in only one year. He claims that these trends will continue, and that computers will be able to emulate human brains by the year 2020.
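
To see what a shrinking doubling period means in terms of annual growth, here is a minimal Python sketch (the periods chosen are purely illustrative, not Kurzweil's published figures):

# A shrinking doubling period implies an ever-increasing annual growth
# rate. The periods below are illustrative examples only.
for doubling_period_years in (3.0, 2.0, 1.0, 0.5):
    annual_growth = 2 ** (1.0 / doubling_period_years) - 1.0
    print(f"doubling every {doubling_period_years} years "
          f"= {annual_growth:.0%} growth per year")

A three-year doubling period corresponds to roughly 26% growth per year; a one-year period to 100%; and so on.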


If and when computers become intelligent, they could themselves construct new computers, causing a massive acceleration in technological progress. One way to visualise this acceleration is to consider the following rather trivial function.

f(t) = -1 / t


Readers with a working knowledge of mathematics can see that this function gives very large values for negative values of t that are very close to zero. As t approaches zero (from the negative side), the value of the function approaches infinity. This is called hyperbolic growth. A hyperbolic function grows much faster than an exponential function, as the reader can easily see in the following table. Note that t takes on negative values; think of it as a countdown to the Singularity.

Exponential vs Hyperbolic

Time (t)      Exponential (2^(t+1))    Hyperbolic (-1/t)
-2.0          0.5                      0.5
-1.5          0.7                      0.67
-1.0          1                        1
-0.5          1.41                     2
-0.01         1.986                    100
-0.000001     2 (approx.)              one million
0             2                        undefined


Exponential functions have well-defined outputs for all inputs, as opposed to our hyperbolic function f(t). At t = 0 the function is undefined (since division by zero is undefined). This is a mathematical singularity, corresponding to the Singularity. If we extend the analogy a bit, negative values of t represent the time before the Singularity.
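
As a quick check of the figures in the table, here is a minimal Python sketch (not part of the original entry) that evaluates both functions as t counts down towards zero:

# Evaluate the exponential and hyperbolic functions from the table above
# as t counts down towards zero.

def exponential(t):
    """Ordinary exponential growth: 2^(t + 1)."""
    return 2 ** (t + 1)

def hyperbolic(t):
    """Hyperbolic growth: -1/t, which blows up as t approaches zero."""
    return -1.0 / t

for t in (-2.0, -1.5, -1.0, -0.5, -0.01, -0.000001):
    print(f"t = {t:>10}: exponential = {exponential(t):6.3f}, "
          f"hyperbolic = {hyperbolic(t):12.2f}")

# At t = 0 the hyperbolic function raises ZeroDivisionError: the
# mathematical singularity of the analogy.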


If we interpret the Singularity literally, it would mean that we'll make nearly infinite progress (whatever that means) in finite time, which is a rather absurd notion that is impossible to fully grasp. According to Vinge, 'new models must be applied' for humans to have the slightest chance of understanding what's going on.


The figurative interpretation of the concept of infinite progress is not only more believable but also much easier to visualise. It is presented in the book Can We Avoid a Third World War around 2010? by Peter Peeters. The author has extrapolated and found the hyperbolic growth mentioned above, but uses this method strictly to draw attention to possible crisis points. He claims that the method can predict points in time where there is a change so fundamental as to be called 'the end of the world as we know it' by pessimists, and lends his claims some validity by showing that historical crises such as the First World War coincide with his analytical crisis points. It should be noted, however, that the Second World War did not fit the model.


Thus, the conservative view of the Singularity is that it is a point of change or crisis. Another way of putting it is that the trends are all there, but the significance and results of these trends are still being discussed.

Paths to the Singularity


The Singularity can be reached in a number of different ways. Here are some examples.

  • Neurohacking

    In this scenario, humans learn how to improve the brain either purely biologically or by introducing some kind of technology into the human brain, ie cybernetics. This ties in a bit with transhumanism, and the resulting positive feedback loop is what gives rise to the Singularity.

  • Self-enhancing computers

    Most people who know a bit about computers know about Moore's Law. But what would happen if we created an artificial intelligence that is just as smart as us and let it construct new computers? It's a textbook example of a positive feedback loop. At first, there would be no difference; processing power would double in another 18 months. After that, since the AI is working faster, processing power would double in 9 months2, then 4.5 months, et cetera, and presto! We've reached the Singularity. (A short sketch of this arithmetic appears after this list.)

  • Uploading

    Instead of creating intelligent software the hard way by writing all the code, why not just tap into the software already in our own heads? Some claim that it is theoretically possible to emulate a human brain (using so-called 'whole brain emulation'), and that it will be possible to upload a person's mind into a computer. That would create a sentient computer program that can enhance its own code, causing the positive feedback loop that in turn causes the Singularity.

  • Nanotechnology

    It is already possible to construct extremely small engines at the molecular level. This is called 'nanotechnology', and is often closely associated with so-called 'nanobots', hypothetical robots that measure only a few nanometers across. Theoretically, nanotech could be used for neurohacking, to create an intelligent computer, or for whole brain emulation. A good source of information for the aspiring nanotechie is the Foresight Institute, a 'nonprofit educational organization formed to help prepare society for anticipated advanced technologies'. Another important nanotech website is that of Zyvex, the first molecular nanotechnology company.
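
Here is the sketch promised under 'Self-enhancing computers' above: a small, hypothetical Python illustration of the positive feedback loop. The 18-month starting figure is just Moore's Law; the assumption that each doubling halves the time to the next one belongs to the scenario, not to established fact.

# Hypothetical sketch: each doubling of processing power halves the
# time needed for the next doubling (a positive feedback loop).
months_to_next_doubling = 18.0   # Moore's Law as the starting point
elapsed_months = 0.0
relative_speed = 1.0

for generation in range(1, 16):
    elapsed_months += months_to_next_doubling
    relative_speed *= 2
    print(f"generation {generation:2d}: {elapsed_months:7.3f} months elapsed, "
          f"speed x{relative_speed:,.0f}")
    months_to_next_doubling /= 2  # the faster machine halves the next cycle

# The elapsed time approaches but never exceeds 18 + 9 + 4.5 + ... = 36
# months, yet the speed keeps doubling: 'infinite' progress in finite time.

In other words, the hyperbolic blow-up described earlier is simply this geometric series viewed as a curve.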

Singularitarians


Although most of this sounds like pure science fiction, it is, like most sci-fi, slowly becoming science. Some people claim that it is perfectly logical to expect the Singularity to occur no later than the year 2010, if we work hard enough. These people are, of course, Singularitarians. Note that the Singularitarians are not really an organized community; it's more a term for people who are actively working towards the Singularity. This section lists some of the attempts at making an organized effort to reach the Singularity.


Among those working towards the Singularity is the Singularity Institute, whose members feel that we don't have much time before research on nanotechnology is completed. They often cite the 'gray goo problem'3, and claim that to avoid a global catastrophe, we need to reach the Singularity caused by a self-enhancing AI. An h2g2 Researcher commented on this, saying that 'the prospect of the human race being relegated to second place by AIs is a lot more worrying than politicians gaining another weapon of mass destruction (after all, we already have the nuke). The "active shield" concept4 of nanotech offers some kind of safeguards - whereas super-AIs would, by their nature, get round any safeguards we put up.'


The person who seems to be the most motivated agent of the Singularity is of course also a founder of the Singularity Institute. His name is Eliezer Yudkowsky, and his website The Low Beyond is a vast source of information pertaining to the Singularity.


John Smart, a private teacher from Los Angeles, has a relatively conservative attitude towards the Singularity. He is making people aware of the Singularity through his website Singularity Watch, a source of carefully developed writing on the subject at an introductory and multidisciplinary level. According to the website, a 'singularity watcher' is 'neither absolutely convinced - nor uncritically happy - that the singularity is going to happen, but they do believe this issue deserves serious scientific investigation'. (Due to his more conservative approach, Smart doesn't capitalise 'Singularity'.)


The proposed society called the Singularity Club has an attitude towards the Singularity that is diametrically opposed to Smart's. It is a more radical and controversial movement that claims that 'one must have the desire to not only survive the Singularity, but to ride its powerful wave right to Ascension (godhood). To, in effect, become the Singularity.'

Conclusion


It is important to stress that these are all speculations, conjectures and theories. Like all theories, the Singularity is subject to debate. Many point to the laws of physics; an infinitely fast computer-making computer would probably generate so much heat that it would incinerate the planet. Others point out that positive feedback can only take us so far, and that growth grinds to a halt after a certain threshold, resembling a sigmoidal curve5 rather than a hyperbolic one. An h2g2 Researcher described how positive feedback relates to sigmoidal curves in the following way: 'Hold a microphone up to a loudspeaker playing back the microphone signal. What happens? Yes, but how loud does it get? There are several built-in limitations to the system.'
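
To make that contrast concrete, here is a minimal, illustrative Python sketch of a logistic (sigmoidal) curve; the ceiling and growth rate are arbitrary assumptions, not anyone's prediction. Unlike the hyperbolic curve, which blows up at t = 0, the logistic curve rises steeply for a while and then levels out.

import math

# Illustrative sketch of sigmoidal (logistic) growth: steep in the
# middle, but it saturates at a fixed ceiling instead of blowing up.

def logistic(t, ceiling=100.0, rate=1.0):
    """S-shaped growth that levels off at `ceiling`."""
    return ceiling / (1.0 + math.exp(-rate * t))

for t in (-10, -5, -1, 0, 1, 5, 10):
    print(f"t = {t:3d}: logistic = {logistic(t):7.3f}")

# The output climbs quickly around t = 0 but never exceeds 100: these are
# the 'built-in limitations' of the loudspeaker analogy above.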


There is no real evidence that the Singularity will occur, simply because it hasn't occurred yet. Just as it's impossible to make a 100% accurate weather forecast, it's impossible to know if a certain event will ever come to pass. The reason that it does seem as if we might reach the Singularity is that there are many ways to reach it, and many current trends point in that direction.


Many information theorists, evolution theorists and computer scientists come to the same conclusion: We will reach the Singularity, whatever it may be, in one way or another.

1 In this entry, the capitalised term 'Singularity' refers to the technological singularity predicted by Vinge, Kurzweil et al. The term 'singularity' (lower case 's') refers to singularities in general, such as mathematical singularities and the center of a black hole.
2 Remember, this is still equivalent to 18 months of work for human researchers, but the artificial intelligence is working at double speed.
3 What might happen if one were to construct nanotech weapons that could disassemble the entire Earth atom by atom in a single week? Read about the 'gray goo problem' in K. Eric Drexler's Engines of Creation, chapter 11.
4 Also in chapter 11 of K. Eric Drexler's Engines of Creation.
5 A curve/function S(x) that resembles the hyperbolic curve in that it's steep near x = 0, but eventually levels out and is more or less constant for large positive values of x.
