D-Wave IS QUANTUM, insist USC scientists

Sceptics remain wary

The evidence seems to be stacking up in favour of D-Wave's claim to be offering the world a kind of quantum computer, but merely exhibiting quantum behaviour might not be enough for the company to turn the computing world upside down.

The latest news, which broke late last week, is that tests at the University of Southern California demonstrate that quantum effects appear to be at the heart of the D-Wave processor. From the release:

“The team demonstrated that the D-Wave processor housed at the USC-Lockheed Martin Quantum Computing Center behaves in a manner that indicates that quantum mechanics has a functional role in the way it works. The demonstration involved a small subset of the chip’s 128 qubits.

“In other words, the device appears to be operating as a quantum processor — something that scientists had hoped for but have needed extensive testing to verify.”

The USC analysis has been accepted for publication in Nature Communications (pre-press version on arXiv), which means the work was at least strong enough to survive the peer review process.

According to the paper's lead author, Sergio Boixo: “Our work seems to show that, from a purely physical point of view, quantum effects play a functional role in information processing in the D-Wave processor.”

However, the paper has triggered an ongoing controversy since it first appeared on arXiv. Scott Aaronson, a long-time critic of D-Wave, told The Register that while the USC experiment found evidence for quantum annealing behaviour in the device, it also found that at the current scale, the machine under test (a 108-qubit processor) would be “outperformed on its own, native problem, by simulated annealing running on a standard laptop”.

The next iteration of D-Wave's technology, a 512-qubit machine still under test, might also be beaten by a high-end desktop PC running a simulated annealing algorithm.
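Simulated annealing, the classical benchmark Aaronson invokes, is easy to run on commodity hardware. As a rough illustration only (this is not the benchmarking code used in the study, and the schedule and parameters are arbitrary choices), here is a minimal single-spin-flip Metropolis annealer for an Ising minimisation problem of the kind the D-Wave hardware natively targets:

```python
import math
import random

def simulated_annealing(J, h, sweeps=2000, beta_max=5.0, seed=0):
    """Minimise the Ising energy E(s) = sum_{i<j} J[i,j]*s_i*s_j + sum_i h_i*s_i
    over spins s_i in {-1, +1}, using Metropolis updates while the inverse
    temperature beta ramps linearly from ~0 up to beta_max."""
    rng = random.Random(seed)
    n = len(h)
    # Neighbour lists, so a spin flip only has to inspect coupled spins.
    nbrs = {i: [] for i in range(n)}
    for (i, j), c in J.items():
        nbrs[i].append((j, c))
        nbrs[j].append((i, c))
    s = [rng.choice((-1, 1)) for _ in range(n)]
    energy = lambda: (sum(c * s[i] * s[j] for (i, j), c in J.items())
                      + sum(h[i] * s[i] for i in range(n)))
    best_e, best = energy(), s[:]
    for sweep in range(sweeps):
        beta = beta_max * (sweep + 1) / sweeps
        for i in range(n):
            # Energy change from flipping spin i.
            dE = -2 * s[i] * (h[i] + sum(c * s[j] for j, c in nbrs[i]))
            if dE <= 0 or rng.random() < math.exp(-beta * dE):
                s[i] = -s[i]
        e = energy()
        if e < best_e:
            best_e, best = e, s[:]
    return best, best_e
```

On a ferromagnetic chain (all couplings -1), for example, the routine quickly finds the all-aligned ground state; the interesting question in the benchmarks is how the time to solution scales as instances grow.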

Aaronson noted that the enthusiastic media embrace of quantum annealing as a solution to Google's artificial intelligence problems is misplaced, saying that at the moment, the quantum annealing computer “has exactly one use: doing physics experiments to try to understand the machine”.

As noted in the USC release, the experiment only demonstrated annealing behaviour on eight of the device's 108 working qubits, and herein lies another problem for those hoping to turn the D-Wave machine from a curiosity into a powerhouse.

Matthias Troyer, a professor at the Institute for Theoretical Physics at ETH Zurich in Switzerland, explained the problems still facing D-Wave in an e-mail to The Register.

“First, it will have to be able to outperform classical computers as one increases the problem size. Work on investigating this is in progress. If the increase of time to solution with problem size should be slower than for classical computers then the device might be useful at some point.

“Second, the device only has a limited connectivity. Every qubit couples only to at most six other qubits. A general optimisation problem might need couplings between more (or all) variables. This means that N*N qubits might be needed to encode an N-variable optimisation problem on the device: an interesting problem with a few thousand variables might need millions of qubits.

“Third, this is an analog device and not a digital one. Like in any analog device (also classical ones), there are calibration errors, which in the D-Wave device are of the order of a few percent. Many optimisation problems require much higher precision and thus calibration will need to be improved. It is as yet unclear if sufficient accuracy can be achieved. Calibration errors were one of the main reasons why classical computing switched from analog to digital in the first half of the 20th century.”
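Troyer's second point, the connectivity overhead, follows from a back-of-the-envelope count. The accounting below is an illustration of why the overhead is roughly quadratic, not D-Wave's actual embedding scheme: if each physical qubit offers at most six couplers and (pessimistically) two of those are spent chaining it to adjacent qubits representing the same logical variable, only four remain for couplings to other variables:

```python
import math

def chain_length(n_vars, max_degree=6):
    """Physical qubits needed per logical variable in a fully connected
    n-variable problem: the variable must couple to n_vars - 1 others,
    with only (max_degree - 2) couplers free per qubit in its chain."""
    usable = max_degree - 2
    return math.ceil((n_vars - 1) / usable)

def qubit_estimate(n_vars, max_degree=6):
    """Total physical qubits: one chain per logical variable."""
    return n_vars * chain_length(n_vars, max_degree)

print(qubit_estimate(2000))  # 1000000
```

The count grows like N²/4 — the same order of magnitude as Troyer's N*N figure — so "an interesting problem with a few thousand variables" does indeed land in the millions of physical qubits.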

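Troyer's third point, analog precision, can be demonstrated on a toy instance small enough to solve by brute force. The two-spin problem below is a deliberately contrived construction (mine, not from the paper): an antiferromagnetic coupling just barely beats the local fields, so a coupling error of about five per cent — the order of magnitude Troyer cites — changes which state is the true minimum:

```python
import itertools

def ground_state(J, h):
    """Brute-force minimiser of E(s) = sum J[i,j]*s_i*s_j + sum h_i*s_i."""
    n = len(h)
    return min(itertools.product((-1, 1), repeat=n),
               key=lambda s: sum(c * s[i] * s[j] for (i, j), c in J.items())
                             + sum(h[i] * s[i] for i in range(n)))

h = [1.0, 0.99]
# Intended problem: coupling 1.02 narrowly wins against the fields.
exact = ground_state({(0, 1): 1.02}, h)          # -> (-1, 1)
# Same problem with the coupling mis-set by ~5%, as an analog device might:
miscalibrated = ground_state({(0, 1): 0.97}, h)  # -> (-1, -1)
```

A digital machine computing with the same instance would reproduce the intended minimum exactly; the analog device solves a slightly different problem than the one it was programmed with.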
As Aaronson wrote on his blog, “there really has been a huge scientific advance” in characterising the D-Wave devices. “D-Wave finally has cleared the evidence-for-entanglement bar — and, while they’re not the first to do so with superconducting qubits, they’re certainly the first to do so with so many superconducting qubits,” he notes in the post. ®
