By "infinite impact," the authors — led by Dennis Pamlin of the Global Challenge Foundation and Stuart Armstrong of the Future of Humanity Institute — mean risks capable of either causing human extinction or leading to a situation where "civilization collapses to a state of great suffering and does not recover."
The good news is that the authors aren't convinced we're doomed. Pamlin and Armstrong are of the view that humans have a long time left — possibly millions of years: "The dinosaurs were around for 135 million years and if we are intelligent, there are good chances that we could live for much longer," they write. Roughly 108 billion people have ever been alive, and Pamlin and Armstrong estimate that, if humanity lasts for 50 million years, the total number of humans who will ever live is more like 3 quadrillion.
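As a rough back-of-envelope (my own illustration, not a calculation from the report), the 3 quadrillion figure doesn't even require explosive growth; it implies an average birth rate below today's:

```python
# Back-of-envelope check on the report's population figures.
# The 108 billion and 3 quadrillion numbers come from the article;
# the implied birth rate is an illustrative derivation, not from the report.

humans_so_far = 108e9          # people who have ever lived
future_span_years = 50e6       # assumed remaining lifespan of humanity
total_humans = 3e15            # projected total humans who will ever live

implied_births_per_year = (total_humans - humans_so_far) / future_span_years
print(f"Implied average births per year: {implied_births_per_year:.2e}")
# ~6.0e7, i.e. about 60 million births a year, well under the
# roughly 130 million births the world sees annually today.
```

In other words, the estimate follows from humanity simply persisting at something like its present scale.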
That's an optimistic assessment of humanity's prospects, but it also means that if something happens to make humans go extinct, the moral harm done will be immense. Guarding against events with even a small probability of causing that is worthwhile.
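That logic can be made concrete with a simple expected-value calculation (the risk-reduction figure here is an arbitrary illustrative assumption of mine, not a number from the report):

```python
# Expected-value sketch of why even tiny extinction probabilities matter.
# The 3 quadrillion figure is from the article; the 0.01% risk
# reduction is an illustrative assumption.

future_lives_at_stake = 3e15   # potential future humans, per the report
risk_reduction = 0.0001        # suppose an intervention cuts extinction risk by 0.01%

expected_lives_saved = future_lives_at_stake * risk_reduction
print(f"Expected lives saved: {expected_lives_saved:.1e}")
# ~3.0e11 -- 300 billion expected lives, dozens of times the number
# of people alive today, from a one-in-ten-thousand improvement.
```

On that arithmetic, even expensive precautions against very unlikely catastrophes can look like bargains.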
So the report's authors conducted a scientific literature review and identified 12 plausible ways it could happen:
1) Catastrophic climate change
The scenario that worries the authors is not moderate warming but warming on the extreme end of current projections. According to a 2013 World Bank report, "there is also no certainty that adaptation to a 4°C world is possible." Warming at that level would displace huge numbers of people as sea levels rise and coastal areas become submerged, and agriculture would take a giant hit.
2) Nuclear war
Surprisingly, even a nuclear exchange far larger than any seen to date would fall short of the level of impact Pamlin and Armstrong require. "Even if the entire populations of Europe, Russia and the USA were directly wiped out in a nuclear war — an outcome that some studies have shown to be physically impossible, given population dispersal and the number of missiles in existence — that would not raise the war to the first level of impact, which requires > 2 billion affected," they write.
So why does nuclear war make the list? Because of the possibility of nuclear winter: if enough nukes were detonated, world temperatures would fall dramatically and quickly, disrupting food production and possibly rendering human life impossible. It's unclear whether that's even possible, or how big a war it would take to trigger it, but if it is, a massive nuclear exchange becomes a plausible cause of human extinction.
3) Global pandemic
Is a pandemic deadly enough to threaten civilization plausible? Medicine has improved dramatically since the Spanish flu of 1918. But on the flip side, transportation across great distances has increased, and more people are living in dense urban areas. That makes worldwide transmission much more of a possibility.
Even a pandemic that killed off most of humanity would surely leave a few survivors who have immunity to the disease. The risk isn't that a single contagion kills everyone; it's that a pandemic kills enough people that the rudiments of civilization — agriculture, principally — can't be maintained and the survivors die off.
4) Ecological catastrophe
Mass extinctions can happen for a number of reasons, many of which have their own categories on this list: global warming, an asteroid impact, and so on. The journalist Elizabeth Kolbert has argued that humans may be in the process of causing a mass extinction event, not least through carbon emissions. And since humans depend heavily on ecosystems, both natural and artificial, for food and other resources, mass extinctions that disrupt those ecosystems threaten us as well.
5) Global system collapse
The core scenario here is an economic or societal collapse on a global scale, severe enough that the world's interlinked systems of trade and governance stop functioning. The paper also mentions other possibilities, like a coronal mass ejection from the Sun that disrupts electrical systems on Earth.
That said, it's unclear whether these things would pose an existential threat. Humanity has survived past economic downturns — even massive ones like the Great Depression. An economic collapse would have to be considerably more massive than that to risk human extinction or to kill enough people that the survivors couldn't recover.
6) Major asteroid impact
The good news is that NASA is fairly confident in its ability to track asteroids large enough to seriously disrupt human life on impact, and detection efforts are improving. Scientists are also working on ways to deflect truly dangerous asteroids, such as by crashing a spacecraft into one with enough force to nudge it onto a path that misses Earth.
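The physics of that approach comes down to momentum transfer, and the required push is smaller than you might expect. Here's a rough sketch, with illustrative values of my own choosing (loosely inspired by NASA's DART test, not figures from the report):

```python
# Kinetic-impactor deflection: a momentum-transfer back-of-envelope.
# All values are illustrative assumptions, not mission specifications.

impactor_mass = 600.0          # kg, a small spacecraft
impactor_speed = 6_000.0       # m/s, relative speed at impact
asteroid_mass = 5e9            # kg, roughly a 150-meter asteroid
beta = 3.0                     # momentum-enhancement factor from ejecta
lead_time_years = 10.0         # warning time before projected impact

# Velocity change imparted to the asteroid: dv = beta * m * v / M
dv = beta * impactor_mass * impactor_speed / asteroid_mass
print(f"Delta-v: {dv * 1000:.2f} mm/s")   # ~2.16 mm/s

# Crude drift estimate: how far that dv displaces the asteroid over
# the lead time (real orbital dynamics amplify the shift further,
# so treat this as a conservative floor).
seconds = lead_time_years * 365.25 * 24 * 3600
print(f"Drift over {lead_time_years:.0f} years: {dv * seconds / 1000:.0f} km")
# ~680 km -- tiny nudges add up, which is why early detection matters.
```

The takeaway: with a decade of warning, even a modest spacecraft can move a dangerous rock meaningfully, which is why detection is the half of the problem NASA stresses.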
7) Supervolcano
Supervolcanic eruptions can cause significant global cooling and disrupt agricultural production. They're also basically impossible to prevent, at least with today's technology, though they are extremely rare. The authors conclude that another eruption on the scale of the Permian-Triassic event is "extremely unlikely on human timescales, but the damage from even a smaller eruption could affect the climate, damage the biosphere, affect food supplies, and create political instability."
As with pandemics, the risk isn't so much that the eruption itself would kill everyone as that it would make continued survival untenable for those who lived through it.
8) Synthetic biology
The hypothetical danger is that the tools of synthetic biology could be used to engineer a supervirus or superbacterium that is more infectious and capable of greater mass destruction than anything that evolved naturally. Most likely, such an organism would be created as a biological weapon, whether by a military or by a non-state actor.
The risk is that such a weapon would either be used in warfare or a terrorist attack, or else leak from a lab accidentally. Either scenario could wind up threatening humanity as a whole if the bioweapon spreads beyond the initial target and becomes a global problem. As with regular pandemics, actual extinction would only happen if survivors were unable to adapt to a giant population decline.
9) Nanotechnology
The concern is that self-replicating nanotech could create a "gray goo" scenario, in which it grows out of control and encroaches upon resources humans depend on, causing mass disruption and potentially civilizational collapse.
10) Artificial Intelligence
If advanced AI remained friendly to humans, that would be a very good thing indeed, and it could speed up research in a variety of domains. The risk is that a sufficiently powerful AI would have little use for humans and, whether out of malevolence or perceived necessity, would destroy us all.
11) Future bad governance
The danger is that governance structures often fail and sometimes wind up exacerbating the problems they were trying to fix. A policy failure in dealing with a threat that could cause human extinction would thus have hugely negative consequences.
12) Unknown unknowns
Finally, the authors leave room for threats no one has thought of yet: the possibility that the most dangerous risk of all is one we currently cannot anticipate.