For seventy years, nuclear fusion has always, and only, been the next big thing. Its promise is tantalising: near limitless, zero-carbon energy. Unlike fission, its cousin which powers today’s nuclear plants, it also leaves no long-lasting radioactive waste.
We have long known it is theoretically possible. In the most over-simplified terms, two light atomic nuclei (such as those of hydrogen) are smashed together, fusing into one and releasing a huge burst of energy. We know this can be done because without nuclear fusion, none of us would be alive. All the light and heat of the sun is generated this way.
Down here on earth, we have generated energy this way before, but only with limited success. Critically, the energy generated by a nuclear fusion experiment has never been more than the energy used to power one.
That was, at least, until this week. First reported in the FT, and since confirmed by the US government, researchers at the National Ignition Facility at the Lawrence Livermore National Laboratory in California have found their holy grail. A man-made fusion experiment has, for the first time ever, given more than it has taken. Some 2.1 megajoules of energy powered a fusion reaction; 2.5 megajoules were generated as a result.
Caution is required. That gain of 0.4 megajoules is only enough to boil a few kettles. We are still many years away from generating energy at scale in fusion power plants.
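The scale of the result can be checked on the back of an envelope. A short sketch, using the figures above; the kettle parameters (one litre of water heated from 20°C to 100°C) are illustrative assumptions of mine, not figures from the experiment, and a larger or fuller kettle would bring the answer closer to a single boil:

```python
# Back-of-the-envelope check of the NIF result, using the figures above.
# Kettle assumptions (illustrative): 1.0 litre (1.0 kg) of water heated
# by 80 K, with the specific heat capacity of water c = 4186 J/(kg*K).

ENERGY_IN_MJ = 2.1   # laser energy delivered to the fuel target
ENERGY_OUT_MJ = 2.5  # fusion energy released

net_gain_mj = ENERGY_OUT_MJ - ENERGY_IN_MJ  # roughly 0.4 MJ
q_factor = ENERGY_OUT_MJ / ENERGY_IN_MJ     # "ignition" means Q > 1

kettle_joules = 1.0 * 4186 * 80             # mass * c * delta-T, ~0.33 MJ
kettle_boils = net_gain_mj * 1e6 / kettle_joules

print(f"net gain: {net_gain_mj:.1f} MJ, Q = {q_factor:.2f}")
print(f"roughly {kettle_boils:.1f} kettle-boils under these assumptions")
```

Whatever kettle one assumes, the order of magnitude is the point: a gain measured in single kettle-boils, from a facility the size of three football fields.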
But this does appear to be a Promethean moment, and it should be celebrated. This is not only true because of what it prophesies for our future, but also because of a more prosaic lesson it teaches us about how great scientific breakthroughs happen.
The US government has been funding research into nuclear fusion since the 1950s, now spending some $700 million each year. This investment, which once looked quixotic, now looks far-sighted.
Investments like these are indeed vital to our future, and the role of the state is critical. No private company could possibly sustain investment like this for so long and for so little immediate reward. It is only now, building on seventy years of state funding, that private-sector investment is crowding in. In the last twelve months, fusion companies raised a further $2.83bn to push fusion to the finish line.
The United States is often lauded for its culture of private-sector investment, while to travel on America’s pitiful public transport is to witness a paucity of good, everyday public-sector spending. Yet, when it comes to long-term research, the Americans have long understood the critical importance of government backing.
This can be dated to a moment in 1957 that terrified American policymakers. That year, the Soviet Union launched the first ever satellite to orbit the earth, Sputnik I, catching the Americans unprepared. In 1958, the US government launched its response: the Advanced Research Projects Agency (ARPA), since renamed DARPA (the “D” standing for “Defense”).
ARPA’s initial remit included America’s space programmes, which were soon handed to the newly created NASA; that agency put a man on the moon just eleven years later, while ARPA itself went on to add defence to both its name and mission.
In the process, it funded breakthroughs that have made the modern world, not least the development of the internet. Long after man first stepped on the moon, DARPA continues to fund long-term research.
America’s moon-shot investments won the space race. Now the US government’s investments in nuclear fusion could be the earth-shot required to win the race for clean energy too.