IBM made headlines this summer when it announced Goldeneye, a “superfridge” that marks another step in its superconducting approach to quantum computing. But with the company itself admitting it may not even use it in future projects, just how much of a milestone was it?
The computing giant has striven to maintain its reputation in an evolving industry, putting its first quantum processor on the cloud in 2016. Weighing in at just five qubits – the quantum counterpart of the classical bit, able to occupy a superposition of the binary values 0 and 1 rather than just one at a time – it paved the way for the 27-qubit Falcon, 65-qubit Hummingbird, and 127-qubit Eagle in the following five years.
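To make the qubit definition concrete, here is a minimal sketch (using NumPy, not any IBM tooling) of a single qubit as a normalized two-component state vector, put into an equal superposition by a Hadamard gate:

```python
import numpy as np

# A qubit's state is a normalized 2-component complex vector:
# |0> = [1, 0], |1> = [0, 1], and superpositions of the two.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate takes |0> to an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ ket0

# Measurement probabilities are the squared amplitudes: 50/50 here.
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5]
```

Unlike a classical bit, the qubit holds both amplitudes at once until it is measured.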
Now IBM is setting its sights one notch higher, having released what it calls its “roadmap” of future quantum developments. Foremost among these will be the planned 433-qubit Osprey processor, due for release at the end of the year, followed by the 1,121-qubit Condor in 2023.
Cybernews reached out to IBM’s quantum processor and system integration manager Pat Gumann to get some more insight into the company’s take on quantum computing and where the technology is going in the near- to mid-term future.
Tell us more about the so-called superfridge IBM built – you’ve said you might not use it in the quantum processors you’re developing?
We started this project about three years ago. Goldeneye is a dilution-refrigeration system capable of reaching almost absolute-zero temperatures – 25 millikelvin, to be precise. We can potentially go lower, but that's not necessarily required for a superconducting quantum computing system.
Think about it like starting the car industry back in the day or the Apollo mission when we were trying to put a man on the moon. It might not necessarily be used with our quantum processors in the future, or it might be – we haven't decided. It was more of a research challenge to engage on a broader path we decided to pursue, which is to build a quantum industry. And this doesn't necessarily only mean building quantum chips and processors. It also requires building an entire auxiliary industry around that.
It starts with a quantum processor. Every single iPhone or your Mac computer, etc., has a CPU [central processing unit] inside. That's typically made on 300mm silicon-wafer technology. So you have metal layers deposited using evaporation or sputtering techniques. Then there's the optical lithography mask, which creates all sorts of circuitry so that you have billions of transistors in your CPU.
That’s very similar to how we build our quantum processors: it's mostly two or three layers, and it's niobium on the same silicon wafer. We don't have to reinvent the wheel. We're in the business of scaling up and building this quantum industry and eventually bringing some revenue to the company.
So what you’re describing is essentially a practical example of how quantum computing will be built on so-called classical computing?
On classical semiconductor technology: not necessarily computing, because you can have semiconducting chips for other applications. Your Ring camera has a whole bunch of IC [integrated circuit] units in it, yet it doesn't compute. So we use silicon-wafer technology to minimize the cost and tap into this existing industry. And then we put different kinds of metal on it, different kinds of circuitry that operate at much higher frequencies. Transistors are not really high-frequency in terms of operations – they are mostly on-and-off switches.
We use microwave pulses at 5-10 GHz frequencies. Your microwave in your kitchen is about 2.5 GHz; your satellite dish is about 10. We can transmit much more energy, and much faster, because we're in the gigahertz range: we're talking nanoseconds. Our quantum processors operate at nanosecond-scale speeds.
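The nanosecond claim follows directly from the frequencies quoted: one oscillation of a gigahertz signal takes a fraction of a nanosecond. A quick back-of-envelope check, using the interview's figures:

```python
# One oscillation period, in nanoseconds, of a control tone at a given
# frequency in GHz. Conveniently, 1 / GHz = ns.
def period_ns(freq_ghz: float) -> float:
    return 1.0 / freq_ghz

print(period_ns(2.5))   # kitchen microwave: 0.4 ns per cycle
print(period_ns(5.0))   # low end of the qubit control band: 0.2 ns
print(period_ns(10.0))  # high end: 0.1 ns
```

Actual gate operations span many such cycles, but the underlying timescale is set by these sub-nanosecond periods.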
Once you've made that quantum computing chip, it's on a big round wafer. Next, you have to dice it: depending on the size of the quantum processor, it can be anywhere between 2.5x3cm and 10x10cm. Our latest development is the Osprey quantum processor, and that's naturally much bigger than the Eagle one we released last year. And next year, we're going to release Condor, which is physically much bigger still.
As quantum processors grow in size, everything else follows. So size matters, but because everything grows bigger physically, the entire cryogenic support for our quantum chips and auxiliary microwave components has to grow as well. That's one approach. Another approach is to stop, draw a line, say we're not going to grow bigger than 500 qubits per chip – and have multiple chips connected to each other. And that's a concept we call modularity.
Talk me through how this modularity approach works a bit more.
Let's say Goldeneye can take up to 100,000 qubits. This is just a number: I'm not sure if it's going to be that or 10,000, or a million. It's not all about the quantum processor chip. It's also everything else that connects to the chip. [demonstrates a scale model of a quantum refrigerator] This is a mini dilution-refrigeration unit: this is the mixing chamber, which can reach about 10-20 millikelvin. You have heat exchangers and vacuum lines.
And for the benefit of our lay readers, how much would 10 millikelvin be in degrees Celsius?
Minus 273 degrees Celsius (-460F), so just a touch above absolute zero. Now we have our qubit chip, which is going to be mounted onto the mixing-chamber plate. Then we have to bring in input/output lines – so if room temperature is up here, we have to have a vacuum feedthrough and run co-ax cables all the way down to the lowest temperature stage. A co-ax cable is a semi-rigid cable made of a copper-nickel alloy: it's got an outer conductor, a dielectric in between, and a center conductor. It's designed to transmit those microwave frequencies at very low losses.
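The conversion is simple arithmetic: kelvin and Celsius share the same degree size, offset by 273.15. A small sketch confirming the figures quoted above:

```python
def millikelvin_to_celsius(mk: float) -> float:
    """Convert millikelvin to degrees Celsius (0 K = -273.15 degrees C)."""
    return mk / 1000.0 - 273.15

def celsius_to_fahrenheit(c: float) -> float:
    return c * 9.0 / 5.0 + 32.0

c = millikelvin_to_celsius(10)  # -273.14 degrees C
f = celsius_to_fahrenheit(c)    # about -459.65 degrees F
print(round(c, 2), round(f, 2))
```

At 10 mK the chip sits just 0.01 degrees above absolute zero.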
So these co-ax cables bridge the gap between room temperature at the top of the model and the much lower temperatures you get to down below?
Correct. But in addition to that, we need a whole bunch of other microwave components: isolators, amplifiers, filters. And all of that has to sit at the lowest temperature. You'd be surprised – your qubit chip might be that big, but everything else around is actually much bigger.
In mainstream technology, we always talk about things getting smaller – for instance, the only thing that stops the cellphone from becoming tiny is the user’s need for a screen. And yet, in the field of quantum, you’re saying everything has to get bigger?
No – it's bigger today because that's the stage we're at; eventually everything has to get smaller. We're at the stage where we're creating the quantum industry. IBM is making the chips, but we also partner up with vendor companies: BlueFors Cryogenics [now known as Bluefors Oy], XMA, Arden [Technologies]. We engage in those partnerships under joint-development agreements to help them understand how to make this microwave isolator, how to miniaturize it so that we do not have to build larger fridges.
We can squeeze all of that into a commercially viable off-the-shelf cryogenics system and then take up to ten of the systems and connect them together to have this modular approach. Our approach requires millikelvin temperatures, and there are not that many vendors building cryogenic microwave components that work at those temperatures. That industry does not exist yet, but it's slowly starting because IBM and many others are big players in this field.
Also, government labs: in the US, there is a lot of money being invested in quantum computing. In Europe as well, the UK has a pretty substantial budget. Everyone is getting on to this quantum train, and as we're picking up pace with it, the support industry is going to follow. But on their own, they cannot really do much because you need the quantum processor that IBM makes to characterize the performance of the new microwave components. So it has to be this symbiotic type of engagement.
Where do you see your industry going in the next few years, do you have specific goals?
I'm trying to push the limits of research and whatever we can accomplish – the superfridge is one example, then we have to take the input from the business community and craft a detailed roadmap. This has to be connected to reality: we're not the US government, we cannot embark on something like the [1940s atomic bomb] Manhattan Project or Apollo mission because that is hundreds of billions of dollars and maybe decades until you get there.
We are going to be scaling up the number of qubits every single year, while also thinking about how those qubits can yield actual value for our potential clients – that's why at some point we'll start implementing error mitigation on our existing quantum processors, as well as quantum error-correction schemes. To build 1,000+ qubit devices, we can either go towards building bigger fridges, like we did, or take that learning and build fridges that are maybe not as big but designed to connect to each other in a modular architecture.
My personal opinion is the entire industry is going to go towards modular design – it's just much simpler. Think about cars: you design one so that it's modular, you have the chassis, frame, and engine. Even Ford or Toyota don't make [different] engines that fit each individual model; it's the same engine in a different chassis.
I spoke to the CEO of photonic-based quantum company QCI, who said that “noise is our friend,” by which he meant his research team doesn't need cryogenics to generate the ultra-low temperatures that create the stable environment quantum computers depend upon. He spoke about his company already working with thousands of qubits, error-corrected. Obviously, the context is different, but what do you say to that – is having to go through cryogenics a hindrance to IBM? What makes you have such faith in the superconducting method?
We're following the entire industry and technology development. Whether quantum error-correction is actually a thing right now is an open question in the research community. Every kind of quantum algorithm you implement, on whatever platform – whether it's photonic-based or superconducting – has to yield a specific result. Let's just wait for those results and see how they compare to a classical algorithm that you can run on high-performance computing-type centers.
Is some other platform better than ours? I don't know. And I encourage everyone to research every platform they can. It helps the entire industry to move forward. Whether it's photon-based, trapped ions, superconductors – we have to do everything. The money is there, whether it's the university research or venture-capital funding level – if somebody is willing to sponsor that, it helps IBM and the entire industry.
Back in the ‘70s, when semiconductor technology was being invented, we had seven or eight different approaches – based on the same 300mm-silicon-wafer technologies. And throughout the years, as the industry was developing, most of them died off. We were left with CMOS because it was the most cost-efficient and versatile. So will the superconducting qubit hit that point versus photon-based quantum computers? I think it's too early to say.
What personally drove me into the superconducting approach is that we can tap into silicon-wafer technology and borrow from the semiconductor industry – we're talking trillions of dollars of investment. For those wafers, you have to have entire foundries: those machines are very expensive, and they take about two to three years to be built. We have all of that in-house. The fundamental building block, like a classical transistor that was invented at the labs – we have it. And we have the ability to scale it up. It was a no-brainer.
Do you ever worry about secrets being hacked and stolen by state-backed threat actors and the like?
Yes, we do worry about that, it's a fair question. After all, we're developing this in the US: it's not a secret that there are other governments working on technologies like that which are not necessarily friends and maybe don't respect intellectual property, even if we file a patent and all that. So we have to be very cautious.
I try to work on research projects and keep them to the absolute minimum in terms of the people involved. And with the third-party vendors we engage with, we also have to make sure they comply with our standards in terms of the nationality of particular workers and stuff like that.
A lot of this is pretty heavy-duty industry. Jack Hidary talked about quantum paving the way for greener forms of energy, but do you get pushback in the meantime from groups worried about your carbon footprint? What steps are you taking to mitigate that, or do you not see it as an issue because developing quantum computing will be of net benefit to the environment?
High-performance [classical] computing centers use up a lot of energy. Google and other big companies: the server farms are miles and miles, and in addition to that, there is almost double the amount of storage for whatever they use to power that. The advantage of quantum computers is all the computing power can be combined into one processor. You know about the exponential scaling of quantum computing – it's not going to be good for every single application, and it shouldn't. It's not a revolution – it's an evolution of classical computing, and it will probably end up being a hybrid between classical and quantum.
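The “exponential scaling” mentioned here refers to the state space: describing n qubits classically takes 2^n complex amplitudes, which is why even modest qubit counts overwhelm classical memory. A rough sketch, assuming 16 bytes per complex amplitude (two 64-bit floats):

```python
# Memory needed to store a full n-qubit state vector classically,
# assuming 16 bytes (two float64s) per complex amplitude.
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (27, 65, 127):  # Falcon, Hummingbird, Eagle qubit counts
    print(n, statevector_bytes(n))
```

At 27 qubits that's already 2 GiB; at 127 qubits the number dwarfs all storage on Earth – which is the sense in which a quantum processor can concentrate computing power that a classical machine cannot match for certain problems.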
But in terms of energy: a dilution-refrigeration system – the off-the-shelf one we can currently buy to host our Eagle or Osprey quantum processor – requires two cryogenic compressors, each of them about 30 kW in power. Then you have the turbo pumps and all that, maybe another 20-30 kW combined, just to run the cryogenic system. Then we need electronics, and that depends on the number of qubits. It's not going to be more than a high-performance computing system – I would be surprised if it's remotely close to that.
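Tallying the figures quoted above (two ~30 kW compressors plus a 20-30 kW range for pumps – numbers from the interview, the totals are just arithmetic):

```python
compressors_kw = 2 * 30   # two cryogenic compressors, ~30 kW each
pumps_kw = (20, 30)       # turbo pumps etc., quoted as a 20-30 kW range

low = compressors_kw + pumps_kw[0]
high = compressors_kw + pumps_kw[1]
print(f"~{low}-{high} kW just to run the cryogenic system")  # ~80-90 kW
```

That's on the order of a single server rack's power draw – modest next to a data-center-scale HPC installation, which is the comparison being made.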