Can there be a Singularity without superintelligence (or vice versa)?
Mitchell Howe:
Strictly speaking, it is possible both for a Singularity to occur without the creation of superintelligence, and for superintelligence to arise without triggering a Singularity. Both are improbable, regardless of the specific criteria used to define Singularity or superintelligence, but some of the potential "loopholes" are worth discussing.
The potential for a Singularity without superintelligence depends largely on which variant of the Singularity is being used. A predictive horizon, for example, can be reached if it is anchored at some particular date (which it never really is, in my experience). In fact, if this date is sufficiently far back in our past, one could argue (uselessly) that we are living in a Singularity now. Also, if a Singularity is considered reached when the distance to the predictive horizon becomes sufficiently small, our own lack of foresight, rather than the arrival of superintelligence, may turn out to be the cause. If the idea of a developmental Singularity is used, it is possible that existing trends in automation will result in sharply spiking productivity without the need for any greater intelligence. Finally, even the "greater intelligence" definition of Singularity need not necessarily mean the arrival of superintelligence, which implies minds vastly more intelligent than we are now.
In each of these cases, however, one must wonder how long exponentially spiking rates of progress, foreseeable or otherwise, could continue before superintelligence appeared as one of the many new products of such an age, or before slightly greater intelligence helped design superintelligent successors. So, the Singularity has a very reasonable chance of preceding superintelligence, but probably not by much. As other parts of this Q&A discuss, it would be very surprising if greater intelligence proved to be impossible or limited.
On the flip side of this question, that of superintelligence without a Singularity, the salient concern is just how "super" superintelligence would be, and how involved it would be in our own affairs. If superintelligence were surprisingly unimpressive, malicious, or apathetic, its creation would not do much to initiate a Singularity for the rest of us. There are, in fact, a host of such concerns people tend to have about superintelligence, and the most important of these have their own extended responses in this Q&A. For now, let it be said that most of the common concerns are groundless, based on flawed if understandable ideas about intelligence, and that the rest can probably be dealt with through responsible approaches to research and design.