Quantum computing and machine learning combo


A recent study shows that even small-scale quantum computers can outperform classical ones in machine learning tasks, offering not only higher speed but also potentially lower energy consumption.

An experimental study has demonstrated that small-scale quantum computers can already boost the performance of machine learning algorithms, paving the way for new applications for optical quantum computers.

University of Vienna researchers have published a study in Nature Photonics on combining quantum computing and machine learning, advancing a line of research known as quantum machine learning.


“This field aims at finding potential enhancements in the speed, efficiency, or accuracy of algorithms when they run on quantum platforms. It is, however, still an open challenge to achieve such an advantage on current quantum technology,” the university said.

Once mature, quantum computers will be able to perform vastly more calculations than classical ones, meaning machines could learn things faster.

By Vilius, Gintaras Radauskas, Ernestas Naprys, Paulina Okunyte

Quantum computers are still far from being fully operational, as current systems heavily depend on environmental conditions, such as temperature and magnetic fields, making them unstable.

However, the new research points to a promising future for the machine learning and quantum computing combo, and it requires nothing beyond current state-of-the-art technology.

The novel experiment features a quantum photonic circuit that runs a special machine learning algorithm.

"We found that for specific tasks our algorithm commits fewer errors than its classical counterpart", explains Philip Walther from the University of Vienna, lead of the project.

What is also important is that photonic platforms may consume less energy than classical computers.


"This could prove crucial in the future, given that machine learning algorithms are becoming infeasible, due to the too high energy demands", said co-author Iris Agresti.