Existential risks are those that threaten the extinction of humanity or, somewhat more broadly, those “where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential” (Bostrom, 2002). While it has long been widely recognized that global nuclear war and catastrophic climate change have potentially civilization-threatening consequences, it is only in the last decade or two that scholars have begun, in work that is typically highly interdisciplinary, to investigate a broader range of existential risks systematically. A landmark event was a conference on global catastrophic risks in Oxford in 2008, together with the accompanying book edited by Bostrom and Cirkovic (2008). Subsequent research has tended to confirm the impression from that event that, on the time frame of a century or so, natural risks (such as asteroid impacts) are overshadowed by anthropogenic ones. And while much work remains to be done in identifying and understanding the various risks, there is now an increasing focus on how to avoid or mitigate them.
We are not yet at a stage where the study of existential risk is established as an academic discipline in its own right. Attempts to move in that direction are warranted by the importance of such research, considering the magnitude of what is at stake. One such attempt took place in Gothenburg, Sweden, during the fall of 2017: an international guest researcher program on existential risk at Chalmers University of Technology and the University of Gothenburg, featuring daily seminars and other research activities over the course of two months. Anders Sandberg served as scientific leader of the program and Olle Häggström as chief local organizer, and the participants came from a broad range of academic disciplines. The program brought substantial benefits in community building and in creating momentum for further work in the field, of which the contributions collected here are one reflection. The present special issue of Foresight is devoted to research carried out and/or discussed in detail at that program. In all, the issue collects the following ten papers that made it through the peer review process.
- Phil Torres: Facing disaster: the great challenges framework.
- Karin Kuhlemann: Complexity, creeping normalcy and conceit: sexy and unsexy catastrophic risks.
- Seth D. Baum, Stuart Armstrong, Timoteus Ekenstedt, Olle Häggström, Robin Hanson, Karin Kuhlemann, Matthijs M. Maas, James D. Miller, Markus Salmela, Anders Sandberg, Kaj Sotala, Phil Torres, Alexey Turchin and Roman V. Yampolskiy: Long-term trajectories of human civilization.
- Anders Sandberg: There is plenty of time at the bottom: the economics, risk and ethics of time compression.
- Alexey Turchin and Brian Patrick Green: Islands as refuges for surviving global catastrophes.
- David Denkenberger, Joshua Pearce, Andrew Ray Taylor and Ryan Black: Food without sun: price and life-saving potential.
- James D. Miller: When two existential risks are better than one.
- Roman V. Yampolskiy: Predicting future AI failures from historic examples.
- Olle Häggström: Challenges to the Omohundro–Bostrom framework for AI motivations.
- Karim Jebari and Joakim Lundborg: The intelligence explosion revisited.