Superforecasting: How seemingly ordinary people can predict everything from world events to election results

 
Melissa York

The President of the Democratic Republic of the Congo was having a stressful time in 2013. A rebel militia had just taken control of Goma, a city of about a million people in the east of the country, and regional powers were putting pressure on him to wrest it back by force.

Little did he know that Michael Story, a policy researcher sitting at his computer 6,500 miles away, had guessed what he was going to do before he’d even decided. Story, you see, harbours a rare talent for predicting global events. He’s part of an elite club of “superforecasters”.

“I was glued to it,” he said. “The other forecasters were predicting a 90 per cent likelihood that he was going to attack this group. They didn’t think he had it in him to open peace negotiations, but that’s exactly what he did, which is what I’d backed him to do.”


Story didn’t have any connection to the Congolese conflict; in fact, he’d only started learning about the situation weeks beforehand. This is normal for the thousands of volunteers around the globe who spend hours raking through news reports and government data to answer questions like “Will Donald Trump be the Republican candidate for President in 2016?” and “Will a woman be appointed to be the next secretary general of the UN?” And they do it all in the name of fun.


It started back in 2011 when political scientist Philip E Tetlock teamed up with psychologists Barbara Mellers and Don Moore (who specialise in decision-making and overconfidence, respectively) to set up the Good Judgment Project (GJP). It aimed to harness the wisdom of crowds to predict world events, calling upon volunteers from around the globe to take part in a forecasting tournament, which meant using their judgement and powers of analysis to answer over 500 geopolitical questions. These were set by the Intelligence Advanced Research Projects Activity (IARPA), a US agency that supports research into improving intelligence services.

Around 20,000 amateur forecasters signed up and, after a prolonged testing period, the cream of the crop formed an elite team. These are the superforecasters – known simply as “supers” in the prediction community – and there are approximately 150 of them in the world.

But how can you measure how good someone is at predicting the future? To sort the supers from the punters, the GJP uses the Brier score, which measures the gap between forecasts and reality: 0 is the bull’s eye, 0.5 is what you’d get by always guessing fifty-fifty, and 2.0 is the score for predicting the exact opposite of what happened.
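Those three benchmark numbers fall out of the score’s definition: for each possible outcome, take the gap between the probability you assigned and what actually happened (1 for the outcome that occurred, 0 for the rest), square it, and add the squares up. A minimal sketch (the function name is ours, not the GJP’s):

```python
def brier(probs, outcome_index):
    """Brier score across all outcomes of one question:
    the sum of squared gaps between forecast probabilities
    and reality (1 for what happened, 0 for everything else)."""
    return sum(
        (p - (1.0 if i == outcome_index else 0.0)) ** 2
        for i, p in enumerate(probs)
    )

# A binary question where "yes" (index 0) happened:
brier([1.0, 0.0], 0)  # perfect call -> 0.0
brier([0.0, 1.0], 0)  # exact opposite -> 2.0
brier([0.5, 0.5], 0)  # coin-flip guess -> 0.5
```

Lower is better, and confident wrong answers are punished far more heavily than hedged ones.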


Unlike most pundits, forecasters constantly update their predictions as news comes in, so if volcanologists report deformation of the surface of Vesuvius, for example, they can adjust their forecast to say it’s 70 per cent likely to erupt, rather than 20 per cent.

This may sound like cheating, but the score is averaged over every day a question is open, so it also rewards how quickly you arrived at the correct forecast: someone who predicted in November that we’d leave the EU would score more highly than someone who called it a week before the referendum.
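That daily averaging is why the early caller wins. A sketch, assuming a binary question scored once per day with equal weight (the helper name and ten-day timeline are illustrative, not from the GJP):

```python
def average_brier(daily_yes_probs, outcome_yes):
    """Average the two-outcome Brier score across every day a
    question is open; moving toward the truth earlier lowers it."""
    target = 1.0 if outcome_yes else 0.0
    daily = [
        (p - target) ** 2 + ((1 - p) - (1 - target)) ** 2
        for p in daily_yes_probs
    ]
    return sum(daily) / len(daily)

# A ten-day question that resolved "yes":
early = [0.5] * 2 + [0.9] * 8  # moved to 90% on day 3
late  = [0.5] * 8 + [0.9] * 2  # moved to 90% on day 9
# average_brier(early, True) is lower (better) than
# average_brier(late, True)
```

Both forecasters ended up at 90 per cent, but the one who got there a week earlier carries the good score across more days.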

Each forecaster is then given a Brier score and the top two per cent are invited to become supers. Story became a super around a year after he started forecasting, and when he travelled to California for a get-together, he noticed some striking similarities with the other supers.

“I made a good friend there who said it was like that bit at the end of ET where he goes back to his home planet and meets all the other ETs. We came from different countries, backgrounds, areas of expertise, but our personalities were very much the same. When I went to find people on Twitter, I found that I was already following some of them without realising it.”



Unsurprisingly, considering the founders’ backgrounds, the GJP employed psychologists to follow the supers’ progress to see what average folks like us could learn from them. They found that people who were good at forecasting were fairly intelligent, but not Mensa candidates.

Essentially, they possessed a healthy amount of cynicism that led them to ask the right questions and weigh up data fairly, and an open-mindedness that allowed them to change their minds easily when the facts contradicted their forecast.

But what’s in it for the supers? Initially, it was a $250 Amazon voucher, but the team has proved so successful that the GJP now has a commercial arm making forecasts commissioned by the US government and multinational corporations. A bestselling book followed, “Superforecasting: The Art and Science of Prediction”, and fans include bigwigs in the Vote Leave campaign, such as Michael Gove, who was spotted leaving Downing Street a few weeks ago with a copy tucked into his man-bag.


The tournaments are also a great intellectual workout, offering a level of self-analysis rarely seen outside Scientology centres. This is what motivates Chris Davies, a strategy and management consultant who, along with Story, is one of only six supers living in the UK.

“It’s so easy to make sweeping statements about stuff – that Trump will never get the Republican nomination, or that a hung Parliament is definitely the outcome of the general election. It’s much more fun to put your opinions to the test,” he says. “What the GJP did differently was quantify those assessments, so you could judge how accurate your estimates were.

“Providing an inconclusive answer would generally prompt a response like President Johnson’s to his economist: ‘Ranges are for cattle, give me a number!’”

For years, intelligence services around the world used words such as “likely”, “significant chance” and “probably” when reporting to politicians, then waited to see if events unfolded in their favour. Subsequent studies have shown you couldn’t devise a vaguer system; words such as “likely” might mean 50 per cent to one person and 80 per cent to another.

At GJP, forecasters express their predictions numerically, so they look something like this: “Superforecasters’ aggregated probability of ‘Leave’ winning the EU referendum – 11 per cent.” That’s a real prediction, by the way, taken from the @GJ_Analytics Twitter account, which publicly updates the project’s Brexit forecast every day.


In “Superforecasting”, Tetlock says this shift to numerical expressions of prediction was the direct result of the catastrophic intelligence report that led to the war in Iraq. “Only after the debacle over Saddam Hussein’s supposed weapons of mass destruction, and the wholesale reforms that followed, did it become more acceptable to express probabilities with numbers.”

According to reports, when Barack Obama asked his CIA analysts whether the man in the Pakistani compound was Osama bin Laden, they told him they were between 70 and 90 per cent sure, a definitive victory over the vague language of past predictions.


You might assume that intelligence analysts who have expertise in military strategy and access to classified information would have the advantage over a bunch of amateurs with sound logic, but the tournaments have shown that’s not the case. An official IARPA report found that the superforecasting team were 40 per cent more accurate than the professional control team, despite having little or no prior knowledge of the geopolitical climate surrounding the question they were asked.

One of Davies’ proudest forecasting moments, for example, was getting together with a couple of other supers to develop a projection model for the impact of terrorist attacks in Nigeria, purely by mining available data on the subject. Story, meanwhile, says his Congolese forecast was good precisely because he was distanced from it.


“It shows that it’s easier to be dispassionate because you start to take it personally when it’s your country and stick to your old prejudices. This year, it’s going to be particularly interesting with the US election, to see whether supers living in the US will get it more right than supers living abroad.”

In “Superforecasting”, Tetlock writes boldly about wanting to improve what he sees as the bloated, expensive – and not terribly accurate – intelligence apparatus that advises our politicians and drives global affairs. For Davies, meeting like-minded people to solve geopolitical puzzles is enough, but for others, like Story, it’s the beginning of a new dawn in intelligence.

A year ago, he set up a company called Brier Consulting and now holds workshops in London to train others in the art of forecasting, a skill he thinks will be increasingly sought-after in business and politics.

“People who are good at forecasting in one area tend to be good at forecasting in most areas,” he says. “It’s a sort of general ability that you can apply to a wide range of things.”

After the Iraq War, the unexpected financial crash and last year’s general election polling disaster, the scene is set for an alternative way of doing things. But can superforecasting succeed where intelligence has failed? That might be the only question they can’t answer with certainty.

Superforecasting: The Art and Science of Prediction by Philip E Tetlock & Dan Gardner is out now on Cornerstone Digital
