Researchers have discovered a way to overcome one of the key challenges in quantum computing – the so-called “reality gap,” which results in seemingly identical quantum devices behaving differently.
To address the issue, scientists used a “physics-informed” machine learning approach, according to Oxford University, which led the research.
The reality gap posed a “major barrier” to a wider use of quantum computing, which could “supercharge a wealth of applications,” it said in an announcement.
Some of the fields expected to benefit from quantum computing include climate modelling, financial forecasting, drug discovery, and artificial intelligence.
Researchers used a “crazy golf” analogy to illustrate the dilemma, according to lead scientist Natalia Ares from the Department of Engineering Science at Oxford.
“When we play ‘crazy golf’ the ball may enter a tunnel and exit with a speed or direction that doesn’t match our predictions,” Ares is quoted as saying.
“But with a few more shots, a crazy golf simulator, and some machine learning, we might get better at predicting the ball’s movements and narrow the reality gap.”
Researchers used a combination of mathematical, statistical, and deep learning methods to build a simulation model, which calculated the difference between the measured current of the quantum device and the theoretical current.
“In the crazy golf analogy, it would be equivalent to placing a series of sensors along the tunnel, so that we could take measurements of the ball’s speed at different points,” Ares explained.
“Although we still can’t see inside the tunnel, we can use the data to inform better predictions of how the ball will behave when we take the shot.”
The model provides a new method to quantify the variability between quantum devices, which could enable more accurate predictions of how devices will perform.
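The residual idea described above — comparing the measured current of a real device against a theoretical prediction and learning the difference — can be sketched roughly as follows. This is a hypothetical illustration only, not the authors’ code: the toy physics model, the sinusoidal “reality gap” term, and the polynomial fit are all assumptions standing in for the paper’s physics-aware deep learning model.

```python
import numpy as np

# Toy "theoretical" model: predicted current through the device
# as a simple function of gate voltage (an assumption for illustration).
def theoretical_current(voltage):
    return 1e-9 * np.tanh(5.0 * voltage)

# Simulated "measured" current: the theory plus a device-specific
# distortion standing in for the reality gap.
def measured_current(voltage):
    return theoretical_current(voltage) + 2e-10 * np.sin(3.0 * voltage)

# Sample measurements at several voltages -- the "sensors along
# the tunnel" in Ares's analogy.
v = np.linspace(-1.0, 1.0, 200)
residual = measured_current(v) - theoretical_current(v)

# Fit the residual with a simple polynomial regression, a stand-in
# for the physics-aware machine learning model in the paper.
coef = np.polyfit(v, residual, deg=9)

# Corrected prediction = theory + learned reality-gap term,
# evaluated at new voltages not used for the fit.
v_new = np.linspace(-0.9, 0.9, 50)
corrected = theoretical_current(v_new) + np.polyval(coef, v_new)

gap_before = np.abs(measured_current(v_new) - theoretical_current(v_new)).max()
gap_after = np.abs(measured_current(v_new) - corrected).max()
print(gap_after < gap_before)
```

In this toy setup the learned correction shrinks the discrepancy between prediction and measurement, which is the sense in which the approach “narrows the reality gap”; the real model in the study is far more sophisticated, combining physical equations with deep learning.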
The study’s co-author, David Craig, a PhD student at Oxford’s Department of Materials, said the researchers’ approach is similar to how black holes are studied: they cannot be observed directly, but their presence is inferred “from their effect on surrounding matter.”
“Although the real device still has greater complexity than the model can capture, our study has demonstrated the utility of using physics-aware machine learning to narrow the reality gap,” Craig said.
The paper, entitled “Bridging the reality gap in quantum devices with physics-aware machine learning,” was published in Physical Review X.