From accurate weather forecasting to faster drug development - quantum computers promise an exciting future. But there are problems we have to tackle along the way.
IonQ recently secured a $13.4 million contract to supply the US Air Force Research Lab with quantum technologies to help protect public and private infrastructure in the US.
Chris Monroe, Co-Founder and Chief Scientist at IonQ, said the field just fell into his lap. He had worked with atomic clocks at NIST almost two decades ago.
“We were just entangling atoms to make the clocks work better. Well, it turned out we were making a tiny quantum computer with quantum logic gates that make entanglement,” he told Cybernews.
I sat down with him to discuss where quantum computing research is heading, what problems might arise, and how far into the future the mysteries of “quantum” might allow us to glimpse.
What’s going on with quantum computing?
One piece of good news is that we have already been through the classical computer revolution over the last 70 years. There are a lot of parallels to what's going on in quantum.
Even before we had regular computers, we learned that the concept of a computer is independent of any hardware, right? You can build a computer out of an abacus, vacuum tubes, or silicon transistors, and you can store data on cassette tapes, hard drives, and flash memory.
It doesn't matter what the hardware is as long as you have hardware that behaves according to this new physics behind computing. But the other exciting thing is that when the hardware was built in the 1950s and 1960s, with the integrated circuit, nobody in their wildest dreams would have imagined you could pack 10 billion transistors on a small square-inch chip and use it for absolutely everything. Now we take photographs with ten megabytes of memory per picture, which is way more than we need. We can afford to be lazy because memory is cheap.
Now, we're not there yet with quantum; it's not yet a commodity where we can do that. We're sort of in the 1950s or early 1960s with quantum, and nobody can predict exactly how big these things will scale. We have hints about certain problems they'll be useful for. And again, that was precisely the case in the 1960s.
The first application for the integrated circuit was a miniature amplifier for hearing aids, so that you didn't have to carry around a big, bulky piece of equipment; you could put it in your pocket. The silicon transistor was much smaller than a vacuum tube. During World War II, computers were developed for the war effort to calculate trajectories. Those were very important applications. We tend to forget about them now.
It's almost impossible to overstate where quantum computers will have applications. It is also very hard to predict precisely when the hardware will be ready. It's pretty exotic.
We have to learn how to program what the native hardware can do. There's a word that we borrow from high-performance computing – co-design.
You get the software and applications engineers together with somebody like me who is a hardware engineer at the very bottom, you know, making the transistors and the individual quantum bits or qubits. We need programmers to know exactly how the hardware works so they can make very efficient code. It's really important to do that.
You are saying that we need to rethink the software part, too?
That's right. This is what's different about quantum. What do we need better computers for? Well, I guess we always need better computers. If you want to calculate the weather more precisely, that takes more memory.
I don't want to say a quantum computer will calculate the weather more precisely, but there are certain types of problems that involve vast amounts of data that quantum computers can naturally tackle. These are problems that we tend to ignore, like really big optimization problems.
Let me give you an example. Any logistics company delivers millions of packages every day. They know where the stuff is and where it needs to go. They have to figure out the optimal way to get the package to the address while minimizing the miles traveled and the energy consumed.
That's a huge logistics problem. There are so many configurations that a classical computer can't go through them all. It makes approximations; it guesses.
To squeeze all the efficiencies out of this and make it cost 10% less – that's huge. That's billions of dollars a day. They can't do that on a regular computer. It's precisely these types of problems that quantum computers may be able to tackle.
The key to a quantum computer is that it can deal with multiple pieces of data at the same time with the same device.

There's a reason why it works for optimization and not for everything.
The answer has to be very simple. In a logistics optimization problem, the answer is simple: here's the map, this is where that package should go, forget all the other paths. There are a gazillion paths, but there's only one optimum path. It's problems like that that a quantum computer is good at. So for anything that has the word optimization in it, quantum computing holds great promise.
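The combinatorial blow-up Monroe describes is easy to see with a toy version of the routing problem. Below is a minimal classical brute-force sketch; the depot, stop coordinates, and stop counts are invented purely for illustration, not taken from any real logistics system:

```python
from itertools import permutations
from math import dist, factorial

# Hypothetical depot and delivery stops as (x, y) coordinates.
depot = (0, 0)
stops = [(2, 3), (5, 1), (6, 4), (1, 5), (4, 6)]

def route_length(order):
    """Total distance of the loop: depot -> stops in 'order' -> depot."""
    path = [depot, *order, depot]
    return sum(dist(a, b) for a, b in zip(path, path[1:]))

# Brute force: evaluate every possible visiting order and keep the best.
best = min(permutations(stops), key=route_length)
print(f"best route: {best}, length: {route_length(best):.2f}")

# The catch: the number of visiting orders grows factorially,
# which is why classical solvers resort to approximations.
for n in (5, 10, 20):
    print(f"{n} stops -> {factorial(n):,} possible routes")
```

With 5 stops there are only 120 orderings, but at 20 stops the count already exceeds 2 quintillion, and a real carrier handles millions of packages. Exhaustive search stops being an option almost immediately, which is the gap heuristics fill today and optimization-friendly quantum algorithms hope to narrow.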
Tell me about the IonQ quantum computer. How is it different from the others, since there are quite a few?
I got into this field more from an academic standpoint almost 30 years ago. I worked on atomic clocks for the National Institute of Standards and Technology (NIST). These atomic clocks are individual isolated atoms. They're not part of a surface, and they're not part of a solid. They're isolated in space, and there's no air either.
They're perfectly isolated and replicable. If I give you an atom of a certain type, a certain isotope, they are the same. That's what a clock needs to be. You need to have a standard. If I tell you that my clock is based on an atom of Cesium 133, we can replicate that with perfection.
So, we were making better atomic clocks. It sounds narrow, but it was a research-based group, and we needed to entangle these atoms. That's the buzzword in quantum. Back then, we didn't know what a quantum computer was. We were just entangling atoms to make the clocks work better. Well, it turned out we were making a tiny quantum computer with quantum logic gates that make entanglement. The field fell in our laps.
We happened to be in the right place at the right time. For those five years or so, we were perfecting how you wire together individual qubits. In our case, those quantum bits are individual atoms, which is pretty exotic when it comes to a computer.
You don't think of a computer as having individual atoms. We poke laser beams at them in a little vacuum chamber. My colleague from NIST, Dave Wineland, won the Nobel Prize partly based on that work in 2012.
Recently, there was another Nobel Prize in this area, having to do with entanglement, some foundational work that we definitely contributed to with our atoms [The Nobel Prize in Physics 2022 was awarded to Alain Aspect, John F. Clauser, and Anton Zeilinger "for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science."].
After 20 years of research in laboratories, Jungsang Kim [Co-Founder of IonQ] and I saw it was ready to be translated into a commercial product. Kim is more of an engineer, and I'm more of a physicist. Together, we cover the science and engineering behind the product.
What's unique about IonQ is that we were the first company to get out there and build a quantum computer not out of the usual silicon or solid-state materials, or even superconductors. Our system is not solid state at all. These are individual atoms.
Unlike in classical computing, the problem in quantum computing is that if the individual transistors or qubits you make are not exactly the same, your errors propagate. It's problematic.
A quantum system has to be isolated for it to work. Individual atoms trapped in a vacuum chamber above a chip provide perfect isolation.
We have all of the basics needed for a quantum computer. It comes down to an engineering vision, how to bring in products and integrate very fancy optical controllers, and things like this. We're not worried about physics anymore. That's done.
That's not true of the other superconducting and solid-state technologies. They need a lot of breakthroughs before they can even think about scaling. They don't know how. Nobody knows how to scale those systems.
We know exactly how to scale. It's just going to be expensive.
Do you think that quantum computers and classical ones will co-exist?
I don't think quantum will replace what classical computers do now. We need good conventional computers just to run quantum computers.
There are problems where classical computers are probably as good as or better than quantum computers. If you have a table where each input gives an output, a classical computer is better.
A quantum computer is more like a funnel. It takes many more inputs, but it produces one output. It will complement the problems that we tackle as a society.
However, we don't tackle those problems yet. So it's like the 1960s: we don't know how many problems are going to come our way. Autonomous vehicles have lots of inputs, and they have to optimize. How do I get to the grocery store without getting in a wreck, while handling all the possibilities coming my way?
Classical computers will always be there for certain types of problems. Right now, we store data in a different device than we compute. Quantum computers, I think, in the long run, will sort of be another part of the computational system.
Recently, your company has shared some big news. For example, you secured a contract to provide quantum solutions to the United States Air Force Research Lab. Do you want to elaborate on this particular partnership or any other projects you find the most interesting?
IonQ started operating in about 2016. By now, we've built six generations of quantum computers, and three more are on the way. Each one is getting more and more powerful.
Over the last two years, we got some exciting opportunities. One, we have an applications team. As I said, they're trying to deploy our systems on real-world problems and working with customers.
You know, Hyundai's very interested in designing new solid-state batteries. What does that have to do with quantum? Studying how the solid-state materials useful for batteries behave is a highly complex quantum problem.
One surprise has come from the financial sector. Finance teams from all the banks in New York, even governments worldwide, have big logistics problems. They like to model the economy. I mean, that's as difficult as modeling the weather. There are so many indicators. There are so many metrics that go into, for example, the Dow Jones index.
That's one application area we're quite high on right now – financial optimization. These banks have full-time people who think 100% of the time about how quantum computing can help their business.
You mentioned the Air Force. [...] People want those machines [quantum computers]. They don't want just to run them on the cloud. They want them in their building. We should have anticipated that.
We're starting to spin up a manufacturing division of our company that will stop building prototypes and start building multiple versions of the same computer. It's like an assembly line in an auto plant: instead of building a custom car, you make lots of the exact same vehicle.
When you do that, you're freezing the design. We expect some pretty high performance by doing that. We're not ready to do it now. Give us a year or two, and we'll be able to deliver these. The Air Force is one such customer that wants a quantum computer on its laboratory grounds.
The Air Force, by the way, is very interested in networking, and that's part of our scaling plan: to network computers together.
You said that you had already built six generations of that computer. I imagine that's not something the Air Force can just buy and then keep using for 20 years. This is still very much a developing field. So how does this work?
That's tricky. Of those six generations, two are running in parallel. They're on the cloud at Amazon, Microsoft, and Google.
The tricky thing about putting a computer on the cloud, much less delivering it, is that it must run all the time. You're not improving it.
When we started the company, all we wanted to do was improve things. We want to get in there and get it to work better and better. But you have to stop. That's a significant challenge. We have to agree to stop improving a system and put it on the cloud, which is the bad side of things.
The good side is that we've learned how to make a machine that runs autonomously. It runs nearly 24/7 without any maintenance. That is hard in quantum. By putting our systems on the cloud, we know how to do that. That's going to help us when the time comes to deliver real products.
It's a big deal delivering hardware to the Air Force. The good thing about the Air Force is that there are very good people on the ground who understand how the system works. That's the point of the agreement. It would be harder to sell it to just anybody at this point. But the Air Force is a perfect partner.
Eventually, we will have black boxes that will go around, but not this year. It's challenging to decide when to stop and deliver and when to keep building. We have to do both. We need users to understand how our systems work, even though they're not yet big enough to break codes or to solve all of Amazon's problems.