The nine things scientists believe could cause the apocalypse (soon), from artificial intelligence and pandemics to asteroids and supervolcanoes

 
Lynsey Barber
A supervolcano is just one threat to human existence (Source: Getty)

Time to hide behind the sofa. Scientists have looked at the biggest risks facing the world and ranked them according to how likely they are to bring about the apocalypse.

Imminent threats to the world as we know it include the rise of artificial intelligence, pandemics - both natural and man-made - and even an asteroid hitting the Earth. That means the end of the human race is most likely to play out like Terminator, 28 Days Later or Armageddon, if that's any consolation.

Researchers have compiled what makes for scary reading in the Global Catastrophic Risks report, detailing the events which could wipe out at least 10 per cent of the world's population.


"There are some things that are on the horizon, things that probably won't happen in any one year but could happen, which could completely reshape our world and do so in a really devastating and disastrous way," said Sebastian Farquhar, director at the Global Priorities Project which collaborated on the research with the University of Oxford and the Future of Humanity Institute.

"History teaches us that many of these things are more likely than we intuitively think. Many of these risks are changing and growing as technologies change and grow and reshape our world. But there are also things we can do about the risks."

The big events which pose a threat include nuclear war, climate change and a supervolcanic eruption. Perhaps scariest of all, though, are the unknown risks which could emerge over the next five years.

"There is really no particular reason to think that humans are the pinnacle of creation and the best thing that is possible to have in the world," Farquhar said on AI.

"It seems conceivable that some AI systems might at some point in the future be able to systematically out-compete humans in a bunch of different domains and if you have a sufficiently powerful form of that kind of artificially intelligent system, then it might be the case that if its goals don't match with what humanity's values are then there might be some sort of adverse consequences.

"So this doesn't depend on it becoming conscious, it doesn't depend on it hating humanity, it is just a matter of it being powerful, its objectives being opaque or hard to determine for its creators, and it being in some sense indifferent to at least some of the things we find valuable."

In terms of a pandemic, the researchers point to Ebola as an example of the need for countries to work together on global solutions, as these events do not respect national borders.

And just as splitting the atom brought with it the risk of nuclear war, it's not only robot technology which creates new risks. Greater use of synthetic biology increases the risk that we'll end up with an engineered pandemic created for more sinister purposes.


"What we want to worry about in the future though is as it becomes easier and cheaper to do a lot of these things in an almost off-the-shelf kind of way, or to order the parts for say a smallpox virus off the internet, that might start to change," explained Farquhar.

"We have seen that in the field of synthetic biology and genetic manipulation of small organisms or things like viruses, the cost has come down unbelievably in the last decade. It is still too expensive to worry about rogue groups trying to use the technology, but that might not remain true."

At least this puts Brexocalypse in perspective.
