Boffin melds quantum processor with quantum RAM

Qubits meet John von Neumann

A California physicist has created a quantum-computing chip based on the CPU/RAM combo known as the von Neumann architecture, opening the door, he argues, to commercial quantum-computing development.

"I think it's very exciting as a researcher because we're really at the borderline between the two worlds of physics and engineering," says Matteo Mariantoni, lead author of "Implementing the Quantum von Neumann Architecture with Superconducting Circuits", published on Thursday in the online pre-publication service of Science magazine, ScienceExpress. (You can get 24-hour access to the article for $15.)

In his quantum computer, he says, computational steps take a few billionths of a second, which is about the same as you get with a classical computer. But unlike a classical computer, a quantum computer can handle a large number of these calculations simultaneously.

Matteo Mariantoni and his chilly quantum computer

As Mariantoni explains in a video provided by the University of California at Santa Barbara, where he is a postdoctoral fellow, the two central quantum phenomena upon which quantum computing is based are superposition and entanglement.

In a classical computer, a bit is either zero or one. In a quantum computer, on the other hand, the quantum superposition phenomenon enables that bit to be simultaneously both zero and one. Or it can be any mixture of zero and one, such as 30 per cent zero and 70 per cent one.
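That "mixture" has a precise meaning: a qubit's state is a pair of amplitudes whose squared magnitudes give the measurement probabilities. A minimal Python sketch of the 30/70 split mentioned above (this is just the textbook arithmetic, not anything from the team's control software):

```python
import math

# A single-qubit state is a pair of amplitudes (a, b) with
# |a|^2 + |b|^2 = 1; |a|^2 is the probability of measuring zero,
# |b|^2 the probability of measuring one.
# The "30 per cent zero, 70 per cent one" mixture from the article:
a = math.sqrt(0.3)  # amplitude for |0>
b = math.sqrt(0.7)  # amplitude for |1>

p_zero = a ** 2
p_one = b ** 2
print(round(p_zero, 2), round(p_one, 2))   # 0.3 0.7
print(round(p_zero + p_one, 2))            # 1.0 (normalisation)
```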

Not only can each individual qubit be both zero and one or any mixture of the two, but two qubits together can take advantage of quantum entanglement. Mariantoni says that entanglement is "pretty much the same idea as the superposition principle. However, whereas the superposition principle works at the level of a single qubit, this entanglement is the same idea but between two qubits, so you can superimpose, you can mix together, two qubits."
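What makes entanglement more than two superpositions side by side is that an entangled pair cannot be described as two independent qubits at all. A toy Python check illustrates this (the factorisability test below is the standard amplitude cross-term identity, not anything specific to Mariantoni's hardware):

```python
import math

# Two-qubit states are four amplitudes over the basis |00>, |01>, |10>, |11>.
# Two independent qubits combine via the tensor (Kronecker) product;
# an entangled state like the Bell state cannot be written that way.

def kron(q1, q2):
    """Tensor product of two single-qubit amplitude pairs."""
    return [a * b for a in q1 for b in q2]

# Independent qubits: |0> paired with an equal superposition.
product_state = kron([1, 0], [math.sqrt(0.5), math.sqrt(0.5)])

# Bell state (|00> + |11>) / sqrt(2): weight on |00> and |11> only.
bell = [1 / math.sqrt(2), 0, 0, 1 / math.sqrt(2)]

# A product state [ac, ad, bc, bd] always satisfies
# amp00 * amp11 == amp01 * amp10; the Bell state violates it.
def factorisable(s):
    return math.isclose(s[0] * s[3], s[1] * s[2])

print(factorisable(product_state))  # True
print(factorisable(bell))           # False -> entangled
```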

Chillin' at the quantum level

Mariantoni's experiment, of course, doesn't involve only one particle per qubit. Instead, he says, his team used "millions or trillions" of particles to make a single qubit – what he referred to as "an immense sea of electrons that are coherently moving together." That "sea" on a single slice of silicon needed to be supercooled in order to remove the noise produced by energetic, warm, electrons before the quantum-mechanical coherence could be achieved.

That super-cooled chip became the site of Mariantoni's quantum-computing breakthrough. "We've created a quantum CPU," he explains, "where two qubits can interact with each other via a quantum bus – so-called quantum bus – allowing us to make a gate, quantum-entangling gate, and this is your central processing unit. We call it the quCPU."

Mariantoni and his team did more than simply create a quCPU – they engineered a quantum-computing system based on the von Neumann architecture, the scheme developed in the mid-1940s by John von Neumann and revealed in the famous "First Draft of a Report on the EDVAC", a seminal paper familiar to any CompSci student.

At its core – and ludicrously simplified – the von Neumann architecture describes the structure of every one of today's computers: a CPU/RAM system in which the memory feeds data and instructions to the CPU, and the CPU stores and retrieves the results of its computation in and out of RAM.
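That fetch-compute-store loop is simple enough to sketch in a few lines of Python. The opcodes below are invented for illustration; the point is only that program and data share one memory, exactly the property the quantum version mimics:

```python
# Toy von Neumann machine: instructions and data share one memory,
# and the CPU loops fetch -> decode -> execute, writing results back.
# The opcodes (LOAD/ADD/STORE/HALT) are made up for this sketch.

def run(memory):
    acc, pc = 0, 0  # accumulator and program counter
    while True:
        op, arg = memory[pc]       # fetch the next instruction from memory
        pc += 1
        if op == "LOAD":
            acc = memory[arg]      # read data out of memory
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc      # write the result back into memory
        elif op == "HALT":
            return memory

mem = {
    0: ("LOAD", 10), 1: ("ADD", 11), 2: ("STORE", 12), 3: ("HALT", None),
    10: 2, 11: 3, 12: 0,           # data lives alongside the program
}
result = run(mem)
print(result[12])  # 5
```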

University of Santa Barbara's Quantum von Neumann Architecture illustration

The quantum von Neumann machine couples two qubits to a quantum bus, creating a "quCPU." Each qubit has its own quantum memory and a zeroing register, which constitute the "quRAM."

The same architecture can benefit a quantum computer. "Once you manage to create quantum information between your two qubits, for example, you want to store it somewhere to reuse it later on for some other computation," he says. "And so we have developed on the same chip the ability to create information and to store it, write it, exactly as in a classical hard disk, in a quantum memory."

In a quantum system, one reason the memory is so important is that it can allow information to hang around. "[Quantum memory] is long-lived," Mariantoni explains, "so basically it lives longer than the qubits, so we can store it there for pretty much long times, compared to the lifetime of the qubits, and when we need it later on, we can reread it from the memory, and bring it out and reuse it again for the next step of the quantum calculation."
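The data flow Mariantoni describes – move a state into the longer-lived memory, reset the qubit via the zeroing register, then bring the state back for the next step – can be caricatured classically. This is purely an analogy for the flow of information; the real device moves states between qubits and microwave resonators, not Python lists:

```python
import math

# Caricature of the write/read cycle: the qubit's amplitudes are moved
# (not copied) into a memory slot, the qubit is zeroed, and later the
# state is read back for the next step of the computation.

def write(qubit, memory, slot):
    memory[slot] = qubit[:]   # move the state into quantum memory
    qubit[:] = [1.0, 0.0]     # zeroing register resets the qubit to |0>

def read(qubit, memory, slot):
    qubit[:] = memory[slot]   # bring the state back out
    memory[slot] = [1.0, 0.0]

q = [math.sqrt(0.3), math.sqrt(0.7)]   # a 30/70 superposition
mem = {0: [1.0, 0.0]}

write(q, mem, 0)
print(q)                      # [1.0, 0.0] -- qubit reset, state parked
read(q, mem, 0)
print(round(q[1] ** 2, 2))    # 0.7 -- original state recovered
```

Note the write is a move, not a copy: quantum mechanics forbids cloning an unknown state, so the qubit must give its state up to the memory.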

It's time to open industry wallets

From Mariantoni's point of view, the ability to build a complete quantum von Neumann architecture system on a chip is a major step in moving quantum computing out of the lab and into industry – even if the system he and his team assembled is quite rudimentary and requires supercooling in order to function.

"I think at this point," he says, "being able to put together separate qubits, and quantum memories, and quantum zeroing registers, I think we should call ourselves quantum engineers. A couple of years ago, we were really quantum researchers."

He likens the state of today's quantum devices to where transistors were in the 1950s – the subjects of research studies, not engineering projects. But once the transistor matured, he says, industry took over, and classical computers arrived.

Quantum computing may still be far from being a viable commercial process, but Mariantoni argues that it's time to get going. "We can ... create something that is very close to a classical processor, and we can use it for implementing pretty complicated quantum softwares," he argues. "My feeling is that, at this stage – and it's really true – industries will be interested in investing money and effort in developing a large-scale quantum computer."

He may be right. With the increasing complexity and cost of shrinking silicon-transistor features in classical computing manufacturing, the possibility of commercial quantum computing is compelling – even if there remains a tremendous amount of work to be done by both researchers and engineers.

Just like there was in the years between Shockley, Bardeen, and Brattain's first demonstration of a transfer resistor in late 1947, and Texas Instruments' marketing of the first commercial silicon transistor in 1954. ®
