In a paper published Thursday in the journal Science, Dr. Sergey Bravyi and his team reveal that they've developed a mathematical proof which, in specific cases, demonstrates a quantum algorithm's inherent computational advantage over its classical counterparts.
"It's good to know, because results like this become parts of algorithms," Bob Sutor, vice president of IBM Q Strategy and Ecosystem, told Engadget. "They become part of decisions about how people will start to attack problems. Where will they try classical techniques? Where will they try quantum techniques? How will those interplay? How will they work back and forth together?"
What's more, the proof shows that, in these cases, the quantum algorithm can solve the problem in a fixed number of steps, regardless of how many inputs are added. With a classical computer, the more inputs you add, the more steps it needs to solve the problem. Such are the advantages of parallel processing.
"The main point of this paper is not that somehow we discover some incredibly important quantum algorithm, or some practical, interesting problem," Bravyi told Engadget. "We ask if we can separate a constant depth [between] quantum and classical algorithms. As we increase the problem size, the runtime of the quantum algorithm remains constant, but the total number of operations grows."
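The separation Bravyi describes is about circuit depth. As a loose, hypothetical illustration (this is not the problem studied in the paper, and the quantum depth below is an assumed placeholder), a classical circuit built from two-input gates that combines n inputs has to form a tree, so its depth grows logarithmically with n, while a shallow quantum circuit's depth stays fixed:

```python
import math

# Assumed placeholder for the fixed depth of a shallow quantum circuit;
# not a figure from the paper.
QUANTUM_DEPTH = 3

def classical_tree_depth(n_inputs: int) -> int:
    """Minimum depth of a fan-in-2 classical circuit that combines
    n_inputs values (e.g., to compute their parity): a binary tree
    of two-input gates needs ceil(log2(n)) layers."""
    return math.ceil(math.log2(n_inputs)) if n_inputs > 1 else 0

for n in (4, 64, 1024):
    print(f"n={n}: classical depth {classical_tree_depth(n)}, "
          f"quantum depth {QUANTUM_DEPTH}")
```

As the problem size grows, the classical depth keeps climbing while the quantum depth stays put, which mirrors Bravyi's point that the quantum runtime remains constant even as the total number of operations grows.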
As Bravyi points out, this new proof doesn't, in and of itself, solve any existing computational issues.
Instead, "it gives us insight into what makes quantum computers more powerful," he continued. "And hopefully in the future it will lead to more practical, useful algorithms."
Those yet-to-be-developed algorithms won't necessarily even be designed for quantum systems; the research could lead to improvements in hybrid classical-quantum systems as well. "We can start discussing things to a much greater depth than we had to, or were able to, before. We can start to really kind of separate out for people, what goes into all the decisions about creating quantum computers, and creating the software stack on them, and algorithms."