Thursday, May 12, 2016 by Greg White
There are a host of existential threats that could wipe out humanity, from virulent viruses to the development of artificial intelligence. According to a recent ominous report, however, government institutions are not taking apocalyptic threats seriously enough.
Climate change, nuclear winter, super volcanoes and asteroids were among the dangers identified as posing an existential threat to humanity. Although these scenarios may sound like science fiction, the researchers warn that many people do not realize how probable they really are.
The report was released by the Global Challenges Foundation and the Global Priorities Project in collaboration with the University of Oxford, and rates significant threats that could kill at least 10 percent of the global population.
Although major catastrophes, such as asteroid strikes or super volcano eruptions, do not occur within most people’s lifetimes, pandemics like the 1918 Spanish flu occur more often than most people think.
In an interview with the Press Association, Sebastian Farquhar, director at the Global Priorities Project, said: “There are some things that are on the horizon, things that probably won’t happen in any one year but could happen, which could completely reshape our world and do so in a really devastating and disastrous way.
“History teaches us that many of these things are more likely than we intuitively think.
“Many of these risks are changing and growing as technologies change and grow and reshape our world. But there are also things we can do about the risks.”
According to the report, the most likely catastrophes to plague humanity in the next five years include a major asteroid impact, the eruption of a super volcano and unforeseen threats. Catastrophes that threaten humanity in the long-term include climate change, nuclear war, pandemics and the development of artificial intelligence (A.I.).
Farquhar added: “There is really no particular reason to think that humans are the pinnacle of creation and the best thing that is possible to have in the world.
“It seems conceivable that some AI systems might at some point in the future be able to systematically out-compete humans in a bunch of different domains and if you have a sufficiently powerful form of that kind of artificially intelligent system, then it might be the case that if its goals don’t match with what humanity’s values are then there might be some sort of adverse consequences.
“So this doesn’t depend on it becoming conscious, it doesn’t depend on it hating humanity, it is just a matter of it being powerful, its objectives being opaque or hard to determine for its creators, and it being in some sense indifferent to at least some of the things we find valuable.”
Natural and man-made pandemics, along with nuclear war, posed the greatest threats to civilization at large, the report stated.
Mr. Farquhar commented that there is no evidence militant groups like the Islamic State will be able to cook up a deadly virus in a lab and unleash it on the masses anytime soon, but that could change in the future.
“We have seen that in the field of synthetic biology and genetic manipulation of small organisms or things like viruses, the cost has come down unbelievably in the last decade.
“It is still too expensive to worry about rogue groups trying to use the technology, but that might not remain true.”
The report urges the international community to better prepare for pandemics, to research the possible threats posed by AI and biotechnology, and to reduce the number of nuclear weapons.
“What is really important to remember is that many of these risks don’t stop at the borders and wait patiently for their passports to be checked, they are truly global in nature,” Mr. Farquhar said.
“This is not the sort of thing where one country can say ‘Oh well we are prepared and the rest of the world can fend for itself’. That is one of the things we saw with the Ebola crisis is how this thing spilled over national borders,” he added.