Supercomputers and nuclear testing: what’s really going on?


Exploring the role of supercomputers in modern nuclear deterrence.

Just as Robert Oppenheimer, the physicist who led the creation of the atomic bomb, struggled with guilt over its devastating impact, El Capitan – the world's fastest supercomputer – carries on unfazed by any such ethical dilemmas.

While Oppenheimer expressed deep remorse about the legacy of the Manhattan Project, El Capitan powers forward, advancing nuclear simulations with astounding speed and precision.


The cultural frenzy surrounding Oppenheimer, which has outlasted the buzz around Barbie, has reignited debate about the moral implications of nuclear advancement.

Christopher Nolan's decision to delve into the topic has brought renewed attention to modern nuclear developments, especially as they intersect with technologies like supercomputing.

The virtual shift: from physical tests to simulations

As New Scientist recently highlighted, El Capitan is designed to simulate nuclear tests, performing 1.742 quintillion (a quintillion is a 1 followed by 18 zeros) calculations per second.
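For scale, that figure corresponds to the machine's reported 1.742 exaFLOPS, or 1.742 × 10^18 operations per second. A minimal back-of-envelope sketch makes the number more tangible; the laptop figure below is an assumed ballpark, not a measured benchmark:

```python
# Back-of-envelope scale check for El Capitan's reported 1.742 exaFLOPS.
# The laptop figure below is an assumed ballpark, not a measured benchmark.
EL_CAPITAN_OPS_PER_SEC = 1.742e18   # 1.742 quintillion calculations per second
LAPTOP_OPS_PER_SEC = 1e11           # roughly 100 gigaFLOPS for a modern laptop (assumption)
WORLD_POPULATION = 8e9              # roughly eight billion people

# How many laptops would it take to match one El Capitan?
print(f"{EL_CAPITAN_OPS_PER_SEC / LAPTOP_OPS_PER_SEC:,.0f} laptops")  # ~17,420,000

# If every person on Earth did one calculation per second, how long would it
# take humanity to match what El Capitan does in a single second?
seconds = EL_CAPITAN_OPS_PER_SEC / WORLD_POPULATION
print(f"{seconds / (365 * 24 * 3600):.1f} years")  # ~6.9 years
```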

Its role in the US nuclear deterrent system demonstrates how nuclear testing now takes place in virtual simulations instead of real-world detonations. But this raises an important question: is this merely an evolution of Oppenheimer’s Manhattan Project?

In theory, regular simulations like these are meant to reduce uncertainty about weapon reliability, ensuring the stockpile stays well-maintained and well-understood.
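To illustrate the principle (and only the principle), here is a toy Monte Carlo sketch: the component model, failure rate, and every number in it are invented for illustration, and real stockpile-stewardship codes simulate physics rather than coin flips. The point is simply that more simulation runs shrink the statistical uncertainty around a reliability estimate without any physical test.

```python
# Toy Monte Carlo illustration: repeated simulation narrows the uncertainty
# around a reliability estimate. All parameters here are invented.
import random

def simulate_component(age_years: float) -> bool:
    """Hypothetical aging component with a made-up failure probability."""
    failure_probability = 0.02 + 0.001 * age_years
    return random.random() > failure_probability

def estimate_reliability(age_years: float, runs: int) -> tuple[float, float]:
    """Return the estimated reliability and its standard error."""
    successes = sum(simulate_component(age_years) for _ in range(runs))
    p = successes / runs
    stderr = (p * (1 - p) / runs) ** 0.5
    return p, stderr

for runs in (100, 10_000, 1_000_000):
    p, err = estimate_reliability(age_years=30, runs=runs)
    print(f"{runs:>9} runs: reliability ≈ {p:.3f} ± {err:.3f}")
```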

However, this focus on maintaining the US stockpile – mirrored by similar efforts from Russia and China with their T-Platforms and Tianhe supercomputers – raises another question: in a world of evolving superpowers, can these simulations truly account for the global scope of nuclear threats and deterrence?


From Nevada to El Capitan: the shift in nuclear testing

The era of live detonations dramatized in the Oppenheimer movie gave way to the Nevada Test Site, established in 1951, which hosted full-scale nuclear testing until 1992. The US then signed the 1996 Comprehensive Nuclear-Test-Ban Treaty (CTBT), committing not to resume physical testing.

El Capitan, following in the lineage of Cray supercomputers like Frontier and Sierra, is built for “simulation fidelity” – staying as true to reality as possible.

There are certainly parallels here, as today’s geopolitical arenas echo the race to nuclear supremacy that began in 1942. Oppenheimer was brought into the Manhattan Project, which can be viewed as the first large-scale multidisciplinary project requiring vast computation – just like today’s nuclear simulations.

Known as “the father of the atomic bomb,” Oppenheimer left a complicated legacy. His calculations proved decisive, yet he was consumed by guilt after the bombs were dropped on Hiroshima and Nagasaki.


The uncertainty of human behavior in supercomputing

While these supercomputers are incredibly powerful and can simulate a wide range of scenarios, human unpredictability remains a significant factor. Political and military leaders can act irrationally or unexpectedly, which can’t always be captured by simulations.


It’s hard to gauge whether Oppenheimer would have endorsed the continued advancement of the weapons themselves, or whether he would have been as pensive as Christopher Nolan depicts him, watching the tectonic political plates grind toward collision.

A common misconception that needs addressing is that supercomputer testing is the same as creating or detonating new bombs. It’s not. These simulations are about ensuring that existing stockpiles remain functional and safe.

But how do we measure deterrence? With ethical dilemmas surrounding AI, can we truly put faith in supercomputers to protect the future of the human race?

The future of AI, quantum computing, and nuclear simulations

The future of AI and quantum computing, as discussed on podcasts like The Lex Fridman Podcast, adds another layer of speculative complexity. Like the score that swells as the film Oppenheimer reaches its crescendo, the noise around AI could amplify these questions louder and louder.

Advocates of virtual nuclear arms testing may point to the environmental benefits – no emissions from detonation – but this doesn’t fully address the complexity of potential consequences.

While Oppenheimer’s Manhattan Project directly shaped today’s nuclear landscape, the current focus on simulation is a different realm. Direct testing ended in 1992, and supercomputer testing now feels as removed from that era as the different Batmans of the DC franchise – each existing in its own world.

And it raises the question: if El Capitan can perform quintillions of calculations per second, could that computational power be better harnessed for other global challenges, such as climate change, medical research, or, most strikingly, nuclear disarmament?
