Existential risks
An existential risk is one that threatens the entire future of humanity. More specifically, existential risks are those that threaten the extinction of Earth-originating intelligent life or the permanent and drastic destruction of its potential for desirable future development.
Types
Nick Bostrom classifies existential risks into the following categories:[1]
Bangs – Earth-originating intelligent life goes extinct in a relatively sudden disaster resulting from either an accident or a deliberate act of destruction.
Crunches – The potential of humankind to develop into posthumanity[7] is permanently thwarted, although human life continues in some form.
Shrieks – Some form of posthumanity is attained but it is an extremely narrow band of what is possible and desirable.
Whimpers – A posthuman civilization arises but evolves in a direction that leads gradually but irrevocably to either the complete disappearance of the things we value or to a state where those things are realized to only a minuscule degree of what could have been achieved.
Anthropogenic risks are those that result from human activity; natural risks are those caused by nature. There could also be risks from alien civilizations.
Interventions
- Retain a last-resort readiness for preemptive action
- Recovery from a global catastrophe
- Differential technological development (see: Differential progress)
- Support programs that directly reduce specific existential risks
List of existential risks
By cause
Anthropogenic risks can be further categorized by whether they are intentional or accidental, and by the economic incentives associated with them.
Technological x-risks:
- AI safety (related: Whole brain emulation)
- Biosecurity
- Climate change
- Nuclear security
- Physical disasters
- Other: Emerging technologies assessment
Other anthropogenic x-risks:
Natural x-risks:
Other:
- Aliens
- Simulation shutdown (see: Simulation hypothesis)
By mechanism
Bangs:
- Nanotechnology
- Nuclear holocaust (see: Nuclear security)
- Simulation shutdown (see: Simulation hypothesis)
- Badly programmed superintelligence (see: AI safety)
- Bioengineered pandemics (see: Biosecurity)
- Physical disasters
- Naturally occurring pandemics (see: Biosecurity)
- Asteroid or comet impact
- Runaway global warming (see: Climate change)
Crunches:
- Resource depletion or ecological destruction (see: Environmentalism)
- Misguided world government or another static social equilibrium stops technological progress (see: Preventing totalitarianism)
- “Dysgenic” pressures
- Technological arrest
Shrieks:
- Take-over by a transcending upload (see: Whole brain emulation)
- Flawed superintelligence (see: AI safety)
- Repressive totalitarian global regime (see: Preventing totalitarianism)
Whimpers:
- Our potential or even our core values are eroded by evolutionary development (see: Population control)
- Killed by an extraterrestrial civilization (see: Aliens)
By type of risk[2]
Transparent risks
Opaque risks
Knightian risks:
- Black swans
- Dynamic environment
- Adversarial environments
Mitigation of Knightian risks:
- Antifragility
- Effectuation
- Capability enhancement
Prize
The Future of Life Institute offers the Future of Life Award to individuals who help prevent existential risks (source).
Organizations
Organizations working on a specific existential risk are mentioned only on the page for that risk.
Main focus:
- Survival and Flourishing
- Future of Humanity Institute
- Future of Life Institute
- Global Catastrophic Risk Institute
- The Cambridge Centre for the Study of Existential Risk
- The Institute for Ethics and Emerging Technologies
- Machine Intelligence Research Institute
- Center for Security and Emerging Technology
- OpenAI
- AllFed
- Center for Long Term Priorities (CLTP)
- The Existential Risk Research Network
- Legal Priorities Project
- Center on Long-Term Risk
- Existential Risk Observatory
- Longview Philanthropy
Related:
Grants
The Open Philanthropy Project, together with Good Ventures, is one of the main funders in this space, as can be seen in their grant database.
Other major grantors include Survival and Flourishing, the Future of Humanity Institute, and the Skoll Global Threats Fund.
See also
External links
- Existential risks website
- Existential risks by Nick Bostrom
- Open Philanthropy Project review of Global Catastrophic Risks
- Global catastrophic risk on Wikipedia
- The case for reducing extinction risk by 80,000 Hours
- Existential risk by the Future of Life Institute
- A comment that considers a “scorched earth” strategy in the context of differential intellectual progress (also worth considering with regard to SIMADs)
- Existential Risk and Growth
- Governing Boring Apocalypses: A new typology of existential vulnerabilities and exposures for existential risk research
- Stanford Existential Risks Initiative (SERI)