
IBM demos quantum computing

Atoms correctly factor fifteen

Well, it's a start, anyway: IBM übergeeks have managed to factor fifteen (into three and five, correctly) with mere atoms, using a quantum factoring algorithm devised by mathematician Peter Shor, the journal Nature reports.

The basic idea of quantum computing is to store data in the nuclei of atoms by manipulating their spin orientation, giving a binary scheme of quantum bits. Because each of these quantum bits can sit in a blend of both orientations at once, rather than settling for one or the other, many computations can in effect run simultaneously and in parallel, potentially making for computers of fantastic power.
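To put some numbers on that (ours, purely for illustration): one qubit carries two amplitudes, seven carry 2^7 = 128, and a single operation acts on all of them at once. A minimal numpy sketch of the idea, not of IBM's experiment:

```python
import numpy as np

# A single qubit is a two-component complex vector: amplitudes for |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate turns |0> into an equal blend of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def uniform_superposition(n_qubits):
    """Put n qubits, each starting in |0>, into an equal superposition:
    the joint state holds 2**n amplitudes, one per n-bit value."""
    state = H @ ket0
    for _ in range(n_qubits - 1):
        state = np.kron(state, H @ ket0)
    return state

psi = uniform_superposition(7)                 # seven spins, the size reportedly used
print(psi.shape)                               # (128,)  i.e. 2**7 amplitudes
print(np.allclose(np.abs(psi) ** 2, 1 / 128))  # True: every outcome equally weighted
```

The point is simply that the state space grows as 2^n while the hardware grows as n, which is where the promised parallelism comes from.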

A little test tube of liquid was zapped with a sequence of pulses from a nuclear magnetic resonance spectrometer, ordered according to the Shor algorithm, and computing history was made at the IBM Almaden Research Center in San Jose, California.
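For the record, the quantum hardware's only job in Shor's algorithm is to find the period of a^x mod 15 for some base a; the rest is schoolbook arithmetic. The Python sketch below fakes that quantum step with a brute-force loop, purely to show the shape of the thing; the base a = 7 is our illustrative choice, not necessarily the one used in the experiment.

```python
from math import gcd

def find_period(a, n):
    """Brute-force the period r of a^x mod n, i.e. the smallest r with a**r % n == 1.
    (This is the step Shor's algorithm hands to the quantum hardware.)"""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factors_from_period(n, a):
    """Given a base a coprime to n, use its period to pull out factors of n."""
    assert gcd(a, n) == 1, "a must share no factors with n"
    r = find_period(a, n)
    if r % 2 == 1:
        return None          # odd period: pick another a and try again
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None          # unlucky case: pick another a and try again
    return gcd(y - 1, n), gcd(y + 1, n)

print(factors_from_period(15, 7))  # period of 7 mod 15 is 4, giving (3, 5)
```

Run it and you get (3, 5), which is about as much suspense as fifteen can muster.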

Now that this has been accomplished, on an admittedly tiny scale, it's at least theoretically possible to devise a machine that could complete in hours calculations which current computers would need centuries to finish.

The practical challenges are obviously enormous, possibly insurmountable, so don't look for working quantum computers to turn up any day soon.

"This method of using nuclei to store quantum information is in principle scalable to systems containing many quantum bits, but such scalability is not implied by the present work," the researchers say.

But it is awfully cool, we must confess. ®

Related Link

The abstract
