D-Wave IS QUANTUM, insist USC scientists

Sceptics remain wary


The evidence seems to be stacking up in favour of D-Wave's claim to be offering the world a kind of quantum computer, but merely exhibiting quantum behaviour might not be enough for the company to turn the computing world upside down.

The latest news, which broke late last week, is that tests at the University of Southern California demonstrate that quantum effects appear to be at the heart of the D-Wave processor. From the release:

“The team demonstrated that the D-Wave processor housed at the USC-Lockheed Martin Quantum Computing Center behaves in a manner that indicates that quantum mechanics has a functional role in the way it works. The demonstration involved a small subset of the chip’s 128 qubits.

“In other words, the device appears to be operating as a quantum processor — something that scientists had hoped for but have needed extensive testing to verify.”

The USC analysis has been accepted for publication in Nature (preprint available on arXiv here), which means the work was at least strong enough to survive the peer review process.

According to the paper's lead author, Sergio Boixo: “Our work seems to show that, from a purely physical point of view, quantum effects play a functional role in information processing in the D-Wave processor.”

However, the paper has triggered an ongoing controversy since it first appeared on arXiv. Scott Aaronson, a long-time critic of D-Wave, told The Register that while the USC experiment found evidence of quantum annealing behaviour in the device, it also found that at its current scale, the machine under test (a 108-qubit processor) would be “outperformed on its own, native problem, by simulated annealing running on a standard laptop”.

Although it's still under test, the next iteration of D-Wave's technology, a 512-qubit machine, might still be beaten by a high-end desktop PC running simulated annealing.
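Simulated annealing, the classical benchmark Aaronson refers to, is simple enough to sketch. The following is a minimal illustrative Metropolis-style annealer for a toy Ising problem (the class of "native problem" the D-Wave hardware optimises), not the benchmarking code used in the studies:

```python
import math
import random

def simulated_annealing(J, h, n, steps=20000, t_start=5.0, t_end=0.05):
    """Minimise the Ising energy E(s) = sum_{i<j} J[i][j]*s_i*s_j + sum_i h[i]*s_i
    over spins s_i in {-1, +1}, using single-spin-flip Metropolis moves."""
    random.seed(0)                      # deterministic, for demonstration only
    s = [random.choice([-1, 1]) for _ in range(n)]

    def energy(spins):
        e = sum(h[i] * spins[i] for i in range(n))
        e += sum(J[i][j] * spins[i] * spins[j]
                 for i in range(n) for j in range(i + 1, n))
        return e

    e = energy(s)
    for step in range(steps):
        # Geometric cooling schedule from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (step / steps)
        i = random.randrange(n)
        s[i] = -s[i]                    # propose a single spin flip
        e_new = energy(s)
        # Metropolis rule: always accept downhill moves; accept uphill
        # moves with probability exp(-dE/T).
        if e_new <= e or random.random() < math.exp(-(e_new - e) / t):
            e = e_new
        else:
            s[i] = -s[i]                # reject: undo the flip
    return s, e

# Toy instance: a 4-spin antiferromagnetic ring. The ground states
# alternate spins (+-+- or -+-+), with energy -4.
n = 4
J = [[0.0] * n for _ in range(n)]
for i in range(n):
    a, b = sorted((i, (i + 1) % n))
    J[a][b] = 1.0
h = [0.0] * n
spins, energy_found = simulated_annealing(J, h, n)
print(spins, energy_found)
```

On a toy instance like this, a classical annealer finds the ground state almost instantly, which is Aaronson's point: at current problem sizes, optimised classical code on commodity hardware keeps pace with the quantum device.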

Aaronson noted that the enthusiastic media embrace of quantum annealing as a solution to Google's artificial intelligence problems is misplaced, saying that at the moment, the quantum annealing computer “has exactly one use: doing physics experiments to try to understand the machine”.

As noted in the USC release, the experiment demonstrated annealing behaviour on only eight of the device's 108 qubits, and herein lies another problem for those hoping to turn the D-Wave machine from a curiosity into a powerhouse.

Matthias Troyer, a professor at the Institute for Theoretical Physics at ETH Zurich in Switzerland, explained the problems still facing D-Wave in an email to The Register.

“First, it will have to be able to outperform classical computers as one increases the problem size. Work on investigating this is in progress. If the increase of time to solution with problem size should be slower than for classical computers then the device might be useful at some point.

“Second, the device only has a limited connectivity. Every qubit couples only to at most six other qubits. A general optimisation problem might need couplings between more (or all) variables. This means that N*N qubits might be needed to encode an N-variable optimisation problem on the device: an interesting problem with a few thousand variables might need millions of qubits.

“Third, this is an analog device and not a digital one. Like in any analog device (also classical ones), there are calibration errors, which in the D-Wave device are of the order of a few percent. Many optimisation problems require much higher precision and thus calibration will need to be improved. It is as yet unclear if sufficient accuracy can be achieved. Calibration errors were one of the main reasons why classical computing switched from analog to digital in the first half of the 20th century.”
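Troyer's second point, the connectivity overhead, is easy to put rough numbers on. The sketch below is a back-of-envelope estimate only, not D-Wave's actual minor-embedding scheme: it assumes each logical variable is represented by a chain of physical qubits, with each qubit limited to six couplers as Troyer describes, and interior chain qubits spending two of those couplers holding the chain together.

```python
# Illustrative degree limit from Troyer's description: each qubit couples
# to at most six others. (The chain model below is a rough assumption,
# not D-Wave's actual embedding scheme.)
DEGREE = 6

def chain_length(k):
    """Qubits needed in a chain so it exposes >= k couplers to neighbours.
    End qubits give up 1 coupler to the chain, interior qubits give up 2."""
    if k <= DEGREE:
        return 1
    length = 2
    while 2 * (DEGREE - 1) + (length - 2) * (DEGREE - 2) < k:
        length += 1
    return length

def qubits_for_complete_graph(n):
    """Physical qubits to embed an n-variable fully connected problem."""
    return n * chain_length(n - 1)

for n in (100, 1000, 5000):
    print(n, qubits_for_complete_graph(n))
```

Under these assumptions, a fully connected 100-variable problem needs 2,500 physical qubits, and a 5,000-variable problem needs some 6.25 million, consistent with Troyer's "millions of qubits" estimate for an interesting problem with a few thousand variables.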

As Aaronson wrote in his blog here, “there really has been a huge scientific advance” in characterising the D-Wave devices. “D-Wave finally has cleared the evidence-for-entanglement bar — and, while they’re not the first to do so with superconducting qubits, they’re certainly the first to do so with so many superconducting qubits,” he notes in the post. ®
