The UK needs a long-term vision for supercomputing

The Spring Budget’s commitment to put the UK at the forefront of supercomputing was a welcome move to deliver on the Prime Minister’s vision of cementing our place as a “science and technology superpower by 2030”.

Headlines are welcome, but we have to wonder why it took us so long to figure out what is blindingly obvious to the US, Germany, China, Singapore and Australia (working together) and Japan, all of which have spent the past decade planning for the exascale supercomputing revolution. After all, the science superpower ambition was first articulated by Boris Johnson in 2019.

For a dozen years there has been a compelling case for an exascale supercomputer in the UK, the nation that paved the way for computing thanks to pioneers such as Charles Babbage, Ada Lovelace and Alan Turing. Exascale machines are capable of a billion billion (10 to the power of 18) operations per second and are between five and a thousand times more powerful than the older petascale generation.
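To put those prefixes in perspective, here is the back-of-the-envelope arithmetic; the 20-petaflop figure is our own illustrative assumption for a large petascale machine, not a number taken from the budget:

$$
\frac{10^{18}\ \text{ops/s (exascale)}}{10^{15}\ \text{ops/s (petascale threshold)}} = 1000,
\qquad
\frac{10^{18}\ \text{ops/s}}{2\times 10^{16}\ \text{ops/s (a 20-petaflop machine)}} = 50
$$

A one-exaflop system therefore sits a thousand times above the petascale threshold, but only tens of times above the largest petascale machines still in service, which is why the comparison spans such a wide range.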

Thanks to the fast interconnects between the thousands of computers (nodes) they contain, supercomputers will always outperform cloud computing in addressing pressing global challenges – for example, creating digital twins of factories, fusion reactors, the planet and, in our own work, even cells, organs and people. These models can accelerate energy, climate and medical research respectively.

A recent report on “The Future of Compute”, published this month, pointed out (as have previous reports) that the UK lacks a long-term view. This is why the budget commitments to exascale (as well as quantum and AI) are extremely welcome. The science and technology framework developed by the new Department for Science, Innovation and Technology also promises a more strategic vision.

But judging by the world’s first exascale machine – Frontier, in the US – a British supercomputer will cost at least half a billion pounds. A similar investment again will be required to exploit it, because software must be tailored to high-performance computers. And we will also need new skills, which will take time and sustained funding to develop.

The budget document’s headline figure of £900m for “peak computing power” seems to fit the bill, but it is billed as covering both exascale and AI. It is unclear whether the government is contemplating a separate AI facility or simply signalling that an exascale machine built around processors called GPUs (graphics processing units) would be far better suited to AI than the UK’s existing machines.

Above all, we hope the government has learned the lessons of recent history. The UK’s lack of long-term vision became painfully clear in 2010, when the Engineering and Physical Sciences Research Council (which manages supercomputing in the UK) announced that there would be no national supercomputer after 2012.

One of us – Peter Coveney – led the charge to challenge that decision. In response, the then science minister, David Willetts, released around £600 million for supercomputing. But when he left office the momentum was lost, and the UK pressed ahead with ARCHER and ARCHER2 in Edinburgh: technological cul-de-sacs without GPUs.

Last November, the UK held just 1.3% of global supercomputing capacity and had no supercomputers in the world’s top 25. Until the latest announcement bears fruit, UK researchers will also lack direct access to the exascale machines being developed by the European Union; the first, Jupiter, will start operating in two years.

Micromanagement by generalists has clouded our view of the future. Researchers have to lobby government, petition and make special cases, then repeat the process over and over again. Even now, the budget says investment in exascale supercomputing is still “subject to normal business processes”.

Because of the short-sightedness of these processes, we end up with a series of impulsive, sticking-plaster policies designed with headlines in mind rather than a long-term vision. The result is a complex, underpowered and fragmented compute ecosystem, as the compute report makes clear. To address these failings, Sir Paul Nurse argues in his recent review of the UK’s research and innovation landscape that government “must replace frequent, repetitive, multi-level reporting and auditing…with a culture of trust and deserved trust”. Academics need to be empowered to make the big decisions about the investments that are needed.

In addition to spurring research in areas as diverse as climate prediction, drug development, fusion and the large language AI models currently making headlines, exascale computers will drive the development of more energy-efficient computing, notably through analogue computing, and will bootstrap developments in AI, semiconductors and quantum computing.

Yes, the price will be high, and an exascale machine will be difficult to deliver by 2026, as the recent review recommends. But please don’t make this another fix-and-forget policy. It is already time to start thinking about a national zettascale supercomputer.

Peter Coveney is director of the Centre for Computational Science at UCL and Roger Highfield is science director of the Science Museum. They are co-authors of Virtual You: How Creating Your Digital Twin Will Revolutionize Medicine and Change Your Life, published on March 28.
