Can Quantum Computing Make Machine Learning Better?

March 25, 2019 - 8 minutes read

There’s no doubt machine learning applications have given businesses and consumers extraordinary capabilities that weren’t possible before. But the truth is the technology is being held back by hardware; traditional CPUs and machine learning don’t exactly work well together.

Can quantum computing help machine learning developers take things a step further? A new study from IBM and MIT hints that the answer is “yes.”

A New Medium for Machine Learning

Scientists have long been captivated by the idea of leveraging quantum computing to improve other technologies. Machine learning is no exception; algorithms in this AI subset often tax traditional CPUs far too much to be efficient. In short, there's huge room for improvement, particularly when it comes to speed.

And that’s exactly what quantum computing could offer. It’s a fundamentally different way of processing information compared to traditional computing. And it has unparalleled potential to expedite complex problem solving, especially when it comes to simulating nature. This potential to solve problems faster is known as “quantum advantage.”

For a while, this potential to process complex problems in a fraction of the usual time was purely theoretical. But a new study from researchers and developers at Cambridge-based MIT and New York-based IBM shows that theory may become reality sooner than we originally thought.

The First Step Toward Faster Machine Learning?

Using a two-qubit (quantum bit) quantum computing system and a lab-generated dataset, the collaboration showed that quantum devices could improve classification, a process integral to many machine learning applications. You can think of classification as the task of predicting the class of given data points. It plays an integral role in approximating a mapping function from input variables to output variables.
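To make "predicting the class of given data points" concrete, here's a minimal classical sketch (entirely illustrative; not the quantum setup from the study) that classifies a new point using a one-nearest-neighbor rule:

```python
# Toy illustration of classification: predict the class of a new point
# from labeled examples using a 1-nearest-neighbor rule.
# (A classical sketch, not the quantum method from the study.)

def classify(point, examples):
    """Return the label of the training example closest to `point`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = min(examples, key=lambda ex: dist(point, ex[0]))
    return nearest[1]

# Hypothetical 2D feature vectors with labels.
training = [((0.1, 0.2), "cat"), ((0.9, 0.8), "dog"), ((0.2, 0.1), "cat")]
print(classify((0.85, 0.9), training))  # prints "dog"
```

The "mapping function from input variables to output variables" is exactly what `classify` approximates: features in, label out.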

The team behind the research acknowledges that this project was simply the first step: “We are still far off from achieving quantum advantage for machine learning. Yet the …methods we’re advancing could soon be able to classify far more complex data sets than anything a classical computer could handle. What we’ve shown is a promising path forward.”

It’s a big step—it shows that we don’t need perfect quantum computers (something currently unavailable) to improve AI. The systems we already have can do so. And if they can work side-by-side with traditional computers, even better.

United by Quirky Mechanics

The study, which was published in Nature, begins with a unique commonality: machine learning that relies on kernel methods (a class of algorithms used for pattern analysis) is mathematically very similar to the inner workings of quantum computers.

The most famous example of kernel methods is the support vector machine (SVM), an algorithm that was hugely popular in the 90s before deep learning stole the spotlight. Still, for problems that discriminate between simple classes, like cats and dogs, SVMs remain a powerful option.

Let’s revisit the “cats and dogs” example to see how SVMs work: Our computer starts with an image of a cat or dog. It then reorganizes the pixels based on a property like shape, color, or another characteristic that may be unfamiliar to human perception. In doing so, it “projects” the input into an abstract high-dimensional space known as a “feature map.” This map then functions as the blueprint the computer uses to build a kernel, a defined way to separate features and aid the classification process.
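The kernel idea can be sketched in a few lines (illustrative values only, assuming the widely used RBF kernel rather than anything from the study): the kernel scores how similar two inputs are *as if* they had been projected into a high-dimensional feature space, without ever computing that projection explicitly.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """k(x, y) = exp(-gamma * ||x - y||^2): an implicit inner product
    in a high-dimensional feature space, computed cheaply."""
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2))

cat_a, cat_b = [0.1, 0.2], [0.15, 0.25]   # hypothetical "cat" features
dog = [0.9, 0.8]                           # hypothetical "dog" features

print(rbf_kernel(cat_a, cat_b))  # close to 1: similar inputs
print(rbf_kernel(cat_a, dog))    # closer to 0: dissimilar inputs
```

An SVM uses exactly these pairwise similarity scores to draw the boundary that separates one class from the other.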

And that’s how machines tell the difference between cats and dogs! Simple, right? Okay, maybe not so much. But when it comes to quantum computing, SVMs are a much better fit than neural networks and deep learning. “Basically, the mathematical theory of kernel methods and quantum mechanics has a lot of similarities, while the theories of quantum mechanics and neural networks are very different,” explains Dr. Maria Schuld, a researcher at Xanadu.

Schuld wasn’t involved with the study but happened to be exploring the same relationship between kernel methods and quantum mechanics at the same time as IBM.

Translating Theory to Reality

Due to their quantum properties, the particles at play in a quantum computer inhabit a large “quantum state” that is filled with various possibilities. Theoretically, this could make the process of feature separation much more efficient than when done on a traditional CPU. According to the study’s authors, this can accelerate kernel-based classifiers in two ways.

One method is to use the quantum computer as the “discriminator.” Basically, training data is mapped into a quantum state (think of it as transforming color images into a bunch of 1s and 0s). This data is then fed into a quantum circuit which mostly maintains the quantum properties until the calculation’s end. It’s not perfect, but it works.
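The "mapping data into a quantum state" step can be illustrated with a classically simulated sketch (a toy, not the study's circuit): in so-called amplitude encoding, a feature vector is normalized so its entries can serve as the amplitudes of a quantum state. Here, four features fill the four amplitudes of a two-qubit state.

```python
import numpy as np

def amplitude_encode(features):
    """Normalize a length-4 feature vector into a valid 2-qubit state.
    (Classically simulated sketch of one common encoding scheme.)"""
    v = np.asarray(features, dtype=float)
    return v / np.linalg.norm(v)

state = amplitude_encode([3.0, 1.0, 2.0, 1.0])
# A valid quantum state's squared amplitudes must sum to 1.
print(np.sum(state ** 2))
```

Once the data lives in a state like this, quantum gates can transform it while preserving its quantum properties, which is what the circuit in the first method does.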

The other method relies on using the quantum computer to figure out the best way to construct a feature map from all the input data. After this is done, a traditional computer can be employed to actually use the kernel to determine whether an image is of a cat or a dog.
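This hybrid strategy can be sketched with a classically simulated toy (hypothetical data and a deliberately tiny one-qubit "feature map," not the study's): kernel entries are the squared overlaps between encoded states, and a simple classical rule then uses those entries to pick a class.

```python
import numpy as np

def encode(x):
    """Toy 'feature map': angle-encode a scalar into a 1-qubit state."""
    return np.array([np.cos(x), np.sin(x)])

def quantum_kernel(x, y):
    """Kernel entry = squared overlap between the two encoded states,
    the quantity a quantum device would estimate by measurement."""
    return np.dot(encode(x), encode(y)) ** 2

cats = [0.1, 0.2]   # hypothetical class-A samples (angles)
dogs = [1.4, 1.5]   # hypothetical class-B samples

def classify(x):
    """Classical step: assign x to the class with the higher
    average kernel similarity."""
    cat_score = np.mean([quantum_kernel(x, c) for c in cats])
    dog_score = np.mean([quantum_kernel(x, d) for d in dogs])
    return "cat" if cat_score > dog_score else "dog"

print(classify(0.15))  # prints "cat": near the class-A samples
print(classify(1.45))  # prints "dog": near the class-B samples
```

The division of labor mirrors the article's description: the quantum device supplies the kernel values, and a conventional computer does the final classification.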

The team tested both strategies with a two-qubit quantum computing system and a lab-generated dataset. And even with all the imperfections of current-generation quantum computers, they were still able to achieve almost-perfect classification. You can actually play with a demo here!

While the experiment was as bare-bones as can be, it’s still great news for anyone itching to combine quantum computing and machine learning. “This approach to quantum machine learning provides us with a path to understanding in which way even noisy, intermediate-scale quantum computers can outperform classical machine learning algorithms,” explains Dr. Jerry Chow, IBM’s Manager of Experimental Quantum Computing.

It’s exciting to see where the relationship between quantum computing and machine learning goes from here. And we’ll be eagerly watching. What do you think will become possible by merging these two technologies? Let us know in the comments!
