Computing Reviews
The technological singularity : managing the journey
Callaghan V., Miller J., Yampolskiy R., Armstrong S., Springer International Publishing, New York, NY, 2017. 261 pp. Type: Book (978-3-662-54031-2)
Date Reviewed: May 17 2018

Research is progressing toward the goal of developing an artificial general intelligence (AGI), the realization of which is referred to as “the singularity.” The term carries the implication that AGI will cause fundamental and profound changes to human society. Researchers from many disciplines have raised questions about the possible dangers, as well as the moral and ethical issues, that such a realization would create. A major problem in this area of research is that we do not yet firmly grasp what constitutes human consciousness or intelligence, much less what might emerge from a research laboratory. An AGI might be intellectually superior to any human yet lack the innate elements of the human mind that make us what we are; or it might be created with such elements, but in either case it might behave unpredictably. This problem is highlighted by observations of current deep learning systems, which sometimes produce results unanticipated by the researchers, who are then unable to trace how those results arose from the inputs. In other words, the system functions as a black box, taking inputs and producing outputs but not revealing the process that leads from one to the other.

The technological singularity is a collection of papers that consider various aspects of what the development of an AGI implies. The material is organized in three parts. Part 1 discusses the risks that might arise if an AGI is realized. Part 2 considers how we might manage the development process to mitigate those risks. Part 3 offers reflections on the journey toward the singularity, including a reprint of Vernor Vinge’s 1993 essay that defined the use of “singularity” in this area. As previously noted, most information about the singularity is speculative, leading to opinions that range from doubt that the singularity could happen at all to fears that it could lead to the extinction of humanity. The book offers papers about research in multiple areas that run the gamut of these opinions, including technical, moral, ethical, and legal views, all of which must be considered when contemplating an intelligence at least equal to, and perhaps far greater than, our own, and possibly sentient and self-aware.

One problem with a book such as this can be a lack of cohesion. The introduction gives a brief definition of “intelligence”; a somewhat more comprehensive one, including the gaps in our understanding, would be preferable to give the reader better context for the discussions that follow. Similarly, an overview or high-level model of the components of an AGI would also help the reader set the context for the papers that follow. Another problem in writings about AGI is that authors discuss concepts with which humanity has no actual experience (although perhaps we see glimpses as AI systems gain increasing capability). This lack of hard facts can lead to a tendency to anthropomorphize the concept of AGI (we write about what we know, after all) and to leave the reader to infer how accurate the characterization of an AGI is. For example, we see statements such as “the AGI achieving its goals” without any background on how the AGI would do this. It would be much clearer for the reader if the introduction were to contain a model of an AGI showing a goal-seeking unit that can recursively set subgoals in pursuit of a top-level goal. That top-level goal might be set by the developer, or even by the AGI itself, but even a marginally intelligent AGI must be able to set subgoals in pursuit of a higher goal.

One final critique: I am always irked when authors present as true scientific “facts” claims that have been shown to be either false or at least unsettled science. One paper here repeats as fact the unproven theory that human production of carbon dioxide makes a meaningful contribution to global warming. See, for example, the contrary arguments summarized by Richard Lindzen, emeritus Alfred P. Sloan Professor of Meteorology at MIT [1]. Beginning with an erroneous assumption calls into question the credibility of the paper’s arguments and conclusions. Authors have a duty to their readers to find and present all sides of an argument, which this book, as related to the singularity, does very well.

There is a wealth of information here, including hundreds of references to recent research. The editors have done a fine job of collecting and presenting all sides of the AGI issue. The papers are well written and, with the one exception noted above, all relevant to the various aspects of AGI that must be considered as research and development continues. Because much of the material is speculative, readers will easily find ideas with which they disagree, but the ideas are generally well thought out and well presented. Overall, this book should be of interest to anyone wishing to learn about the technological singularity and its implications.

Reviewer: G. R. Mayforth. Review #: CR146038 (1807-0371)
1) Lindzen, R. Thoughts on the public discourse over climate change. Merion West, http://merionwest.com/2017/04/25/richard-lindzen-thoughts-on-the-public-discourse-over-climate-change/, April 25, 2017. Accessed May 10, 2018.
Philosophical Foundations (I.2.0)
Regulation (K.5.2)
Public Policy Issues (K.4.1)
Arts and Humanities (J.5)
Other reviews under "Philosophical Foundations":
Rethinking smart objects
Rasmus D. (ed), Cambridge University Press, New York, NY, 1999. Type: Book (9780521645492)
Mar 1 1999
Other bodies, other minds
Harnad S. Minds and Machines 1(1): 43-54, 1991. Type: Article
Nov 1 1991
Do the right thing
Russell S., Wefald E., MIT Press, Cambridge, MA, 1991. Type: Book (9780262181440)
Aug 1 1992
