NASA and Google team up to buy into quantumish computing

Hoping to crack machine-learning conundrum

A consortium of researchers from Google and NASA is planning to crack the problem of machine learning with a $15m quantum computer that will form the basis of a new Quantum Artificial Intelligence Lab.

The new facility, sited at NASA's Ames Research Center in Silicon Valley, will house a 10-square-meter shielded room containing a D-Wave quantum processing machine: a superconducting 512-qubit processor chip cooled to 20 millikelvin, to be upgraded to 2,048 qubits once the hardware becomes available.

"We believe quantum computing may help solve some of the most challenging computer science problems, particularly in machine learning," said Hartmut Neven, Google's director of engineering, in a blog post.

"Machine learning is all about building better models of the world to make more accurate predictions. And if we want to build a more useful search engine, we need to better understand spoken questions and what's on the web so you get the best answer."

As El Reg has pointed out, the D-Wave machine isn't, strictly speaking, a quantum computer of the type that could be used to crack the world's encryption systems. Rather, it uses quantum effects to massively speed up the processing and optimization of data.
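The class of problems the D-Wave targets is quadratic unconstrained binary optimization (QUBO): finding the binary vector that minimizes a quadratic cost function. As an illustrative sketch only (not D-Wave's actual toolchain), a tiny QUBO instance can be solved classically by brute force, which is what the quantum annealer is meant to beat at scale:

```python
import itertools

def solve_qubo(Q):
    """Exhaustively minimise sum of Q[i,j] * x[i] * x[j] over binary x.

    Q is a dict mapping (i, j) index pairs to weights: diagonal
    entries act as linear biases, off-diagonal entries as couplings.
    Brute force is exponential in n -- fine for a toy, hopeless at
    the 512-variable scale a quantum annealer is aimed at.
    """
    n = 1 + max(max(i, j) for i, j in Q)
    best_x, best_e = None, float("inf")
    for bits in itertools.product((0, 1), repeat=n):
        energy = sum(w * bits[i] * bits[j] for (i, j), w in Q.items())
        if energy < best_e:
            best_x, best_e = bits, energy
    return best_x, best_e

# Toy problem: turning on either variable pays off, but a coupling
# term penalises turning on both at once.
Q = {(0, 0): -1, (1, 1): -1, (0, 1): 2}
print(solve_qubo(Q))  # -> ((0, 1), -1)
```

The annealer attacks the same objective physically, letting a network of coupled qubits settle toward a low-energy configuration instead of enumerating candidates.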

Neven said that Google has already tried some machine-learning problems out on the machine, developing a compact, low-power recognizer for mobile phones and ways of sorting through some highly polluted datasets.

Meanwhile, NASA wants to use the system to examine data from the potentially doomed Kepler telescope to find exoplanets across the universe. In addition, 20 per cent of the system's runtime will go to the Universities Space Research Association for other tasks.

"The order for a D-Wave Two system for the initiative launched by NASA, Google and USRA attests to the revolutionary potential of this fundamentally different approach to computing for both industry and government," said Steve Conway, IDC research vice president for high-performance computing.

"HPC buyers and users are looking for ways to speed up their applications beyond what contemporary technologies can deliver," Conway said. "IDC believes organizations that depend on leading-edge technology would do well to begin exploring the possibilities for quantum computing." ®
