What are quantum computers good for?
Forget cracking crypto, think modelling reality itself to help build a better one
The problem with trying to explain quantum computing to the public is that you either simplify the story so far that it becomes wrong, or run down so many metaphorical rabbit-burrows that you get lost and end up wrong anyway.
So The Register is going to invert the usual approach and describe quantum computing at a more material level: how do you build one, and once it's built, how do you use it? Hopefully a concrete framework will make quantum computing easier to understand along the way.
And we promise not to reiterate the story of Schroedinger's cat. Not even once.
The basic design
Actually, a quantum computer is very easy to design. Its main systems are below.
How to build a quantum computer. Image, Andy Davies, The Register
The classical computer is something we're all familiar with. The main reason it's there is that you – the human trying to use the quantum computer – need something to interact with. We haven't quite got as far as a nice GUI for what's inside Schroedinger's box.
The quantum computer is where the qubits live – those difficult creatures that, if we can control them properly, actually perform the quantum computation.
That leaves us the bit in the middle – the quantum-classical interface. It’s the cutoff between the quantum world and the classical world, which is why researchers call it the “Heisenberg cut”. It’s the place where classical information emerges out of quantum mechanics.
When it comes to describing quantum computing, that interface can be both easy and difficult to explain.
The mechanism for manipulating a qubit can be difficult to do (that’s what this year’s Nobel Prize in Physics was all about!) but it isn’t that hard to understand: it may be as “simple” as manipulating a mirror or a polarising filter. But just how “classicality” emerges from the quantum world occupies a lot of theoretical attention – the kind that needs long and deep study, and …
To be honest, it doesn't matter all that much for this discussion. We know that the classical emerges out of the quantum, because we're here to argue about it. And we can see that qubits behave like quantum phenomena: they exhibit entanglement and superposition, and those are the properties that give us quantum computers.
Next: The compiler
There’s another reason to have a classical computer act as the intermediary between the user and the quantum computer: it makes the quantum computer more flexible, because it will carry out a step analogous to a compiler in conventional computing.
The compiler’s job is to take the user’s program instructions, and determine things like the algorithm the program will use, and which quantum operators – qubit gates – are needed to execute the algorithm.
Constant vs balanced?
To make quantum computers useful, we need to understand when they will clearly outperform classical algorithms. That's hard to explain convincingly, but let's try a simple example.
Let’s imagine a (rather pointless) guessing game: Alice has two machines in front of her, each comprising a keypad and a display that can show 0 or 1, and it’s her job to figure out which one is working correctly. The machines accept any input between 1 and 1000 (decimal, not binary), and should display a 0 for 500 of the possible inputs and a 1 for the other 500 – except that Alice doesn’t know which inputs produce which results. All she is told is this:
“The faulty machine only displays 1, never 0”.
The question is: is there an efficient way to test the machines? Alice could simply grab one of the machines, and start punching in numbers. If Alice chose the working machine, and if the association between numbers and 0 or 1 outputs is randomly distributed, she could find an answer fairly quickly that way.
If she’s lucky.
If she’s unlucky, Alice might have to test 500 numbers on one machine before she sees the output change from 0 to 1 (or vice versa). Worse, if she chose the broken machine, 500 identical outputs still aren’t proof: the machine could be a working one whose remaining 500 inputs all give the other answer. She needs a 501st test to be certain.
Mathematically, what Alice is trying to do is work out whether the unknown function – the one inside the working machine – is constant or balanced. Computers can work this out – but the best known classical algorithm still has a worst case requiring one more test than half the number of possible inputs. If the machines accepted numbers between 1 and 2000, then 1001 tests would be needed in the worst case.
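Alice's classical strategy is easy to sketch in code. The snippet below (our own illustration – the `gadget` functions stand in for her machines) makes the worst-case query count explicit: she can stop as soon as she has seen two different outputs, but to declare a machine constant she needs 501 identical answers.

```python
def classify(gadget, n_inputs=1000):
    """Classically decide whether `gadget` (mapping 1..n_inputs to 0 or 1)
    is constant or balanced, counting the queries used along the way."""
    outputs = set()
    for queries, value in enumerate(range(1, n_inputs + 1), start=1):
        outputs.add(gadget(value))
        if len(outputs) == 2:
            return "balanced", queries      # saw both a 0 and a 1
        if queries > n_inputs // 2:
            return "constant", queries      # 501 identical outputs suffice

# Worst-case balanced gadget: first 500 inputs give 1, the rest give 0.
print(classify(lambda v: 1 if v <= 500 else 0))  # ('balanced', 501)
print(classify(lambda v: 1))                     # ('constant', 501)
```

Either way, the unlucky case costs 501 queries – the classical worst case the article describes.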
In one of the earliest breakthroughs in quantum computing, Deutsch and Jozsa devised a simple quantum algorithm for solving this problem. They discovered that if you had a quantum version of these gadgets – one that could take quantum bits as inputs and outputs – then the problem could be solved with a single query.
The key to this quantum advantage is that much of the work that Alice would have performed can be taken care of by quantum superpositions and interference.
This algorithm is initialized by setting the input qubits and output qubit to 000…0.
Then we apply some quantum gates that create a superposition over all possible inputs to the gadget. There is then a circuit element that feeds these inputs into the unknown gadget. Finally, the same operation that created the superposition over all inputs is performed once again and all the qubits are measured.
Before we perform this measurement, the computation is in Schroedinger’s Box (to borrow a metaphor) and the qubits exist in a superposition of 0s and 1s. When we “open the box” and destroy the superposition, we simply read out a string of 1s and 0s.
Now, because of the way this circuit is implemented, if the gadget is defective – that is, constant – then it will give the answer 000…0 with 100 percent probability. If the gadget is balanced – that is, not defective – then at least one of those output bits will read 1: for the working machine, there is a 0 percent chance of reading out a string of all 0s.
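The circuit is small enough to simulate directly. Here is a sketch in Python/NumPy (our own toy construction, using the common phase-oracle form of the algorithm rather than an explicit output qubit): Hadamards create the superposition, the oracle is queried once, Hadamards are applied again, and the probability of measuring all zeros distinguishes constant from balanced.

```python
import numpy as np

def hadamard_n(n):
    """Tensor product of n Hadamard gates."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    Hn = np.array([[1.0]])
    for _ in range(n):
        Hn = np.kron(Hn, H)
    return Hn

def deutsch_jozsa(f, n):
    """Decide whether f (on n-bit inputs, promised constant or balanced)
    is constant, using a single oracle query."""
    N = 2 ** n
    # Step 1: start in |00...0> and create an equal superposition.
    state = hadamard_n(n) @ np.eye(N)[0]
    # Step 2: one query to the phase oracle: |x> -> (-1)^f(x) |x>.
    state = np.array([(-1) ** f(x) for x in range(N)]) * state
    # Step 3: apply the Hadamards again, then "measure".
    state = hadamard_n(n) @ state
    p_all_zeros = abs(state[0]) ** 2
    return p_all_zeros > 0.5   # 1.0 if constant, exactly 0.0 if balanced

# Constant gadget: always 1. Balanced gadget: the low bit of the input.
print(deutsch_jozsa(lambda x: 1, 3))      # True  (constant)
print(deutsch_jozsa(lambda x: x & 1, 3))  # False (balanced)
```

Interference does the work: for a balanced gadget, the +1 and -1 phases cancel exactly in the all-zeros amplitude, so that outcome never occurs.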
Next: getting more complex
It’s more complex than that, surely?
Of course – but this is, at least, an accurate enough framework to act as a starting point. For instance, more sophisticated quantum algorithms are designed to be robust to noise, and Deutsch-Jozsa is not: if we tolerate even a small chance of failure, a classical randomized algorithm performs just as well as Deutsch-Jozsa.
Quantum circuits create finely tuned probability distributions. Any noise makes our measurement that little bit uncertain: did we measure qubits superposed with the desired state, or did we merely observe noise? The solution to this in more sophisticated algorithms, like Shor’s famous factoring algorithm, is to repeat the process, and build up a distribution of the results, increasing the probability that we are observing an answer that is correct.
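That repeat-and-vote idea can be illustrated with a purely classical toy model (the names and error rate below are invented for illustration – there is no real quantum simulation here): a noisy device returns the right bit string most of the time, and repeated runs let the correct answer dominate the distribution.

```python
import random
from collections import Counter

def noisy_measurement(correct="101", error_rate=0.2):
    """Toy model of a noisy device: returns the right answer most of the
    time, and a uniformly random bit string otherwise."""
    if random.random() > error_rate:
        return correct
    return "".join(random.choice("01") for _ in range(len(correct)))

def run_and_vote(shots=200):
    """Repeat the measurement, build a distribution, take the mode."""
    counts = Counter(noisy_measurement() for _ in range(shots))
    return counts.most_common(1)[0][0]

print(run_and_vote())  # "101" with overwhelming probability
```

Each extra shot sharpens the distribution, which is exactly why algorithms like Shor's are run many times and post-processed classically.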
In other words, the output register (where we read the qubits) becomes a probabilistic summary of all the possible output states permitted by the algorithm you’re using.
Of course, it’s quite possible to generate probability distributions with a classical computer – but as the number of possible outcomes grows, so does the time needed to generate them. At some point – as seems to be the case for many problems in the famous complexity class NP, and for many of the problems quantum computers are good at – a classical Turing machine can’t get there in polynomial time.
One of the reasons so much research is directed to how we handle and measure qubits is to give us more confidence in fewer measurements – without destroying the quantum states that we need for the quantum computer to work.
That is, of course, if we’re positive that quantum mechanics is indeed complete. That’s the deepest theoretical question of quantum mechanics, a question that’s almost metaphysical.
Rather than seeking an ultimate, final, indisputable proof that quantum mechanics is right, it’s probably easier to do what we’re doing: continue the research to reach a point at which we can test a quantum computer of reasonable scale, and let the proof come along later. After all, we happily use quantum mechanics to build lasers!
Next: Applications of quantum computing
Applications of quantum computing
Why bother? Worldwide, a lot of research dollars are being poured into quantum computing, in spite of a widespread belief that its only application is to render today’s encryption algorithms useless.
What’s the point of spending research dollars on a theoretical construct of such limited application?
The idea that quantum computing is only good for large-number factoring is still widespread. But some of the standard tools of quantum computing are starting to chip away at this perception. Let’s look at two.
The Fourier transform is one of the oldest such tools: while the mathematics for representing a time-domain signal in the frequency domain is well understood, the more complex the wave you’re trying to analyse, the bigger the calculation. In certain settings a quantum computer can perform Fourier transforms exponentially more efficiently than a classical computer.
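For the mathematically inclined: the quantum Fourier transform on n qubits is exactly the unitary, orthonormally normalised discrete Fourier transform on 2^n points. A quick NumPy check (our own illustration) makes the correspondence concrete – the quantum advantage is that a circuit of roughly n² gates applies this 2^n-by-2^n matrix, where the classical FFT needs time proportional to 2^n.

```python
import numpy as np

def qft_matrix(n):
    """The n-qubit quantum Fourier transform as an explicit unitary:
    entry (j, k) is omega^(j*k) / sqrt(N), with omega = exp(2*pi*i/N)."""
    N = 2 ** n
    omega = np.exp(2j * np.pi / N)
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    return omega ** (j * k) / np.sqrt(N)

# The QFT matches NumPy's inverse FFT under the "ortho" normalisation
# (the sign convention differs between physics and signal processing).
F = qft_matrix(3)
x = np.random.rand(8)
assert np.allclose(F @ x, np.fft.ifft(x, norm="ortho"))
assert np.allclose(F @ F.conj().T, np.eye(8))  # unitary, as required
```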
This is very much a creature of the real world, spanning materials sciences, structural engineering, our understanding of waves, photonics, consumer electronics … the list goes on.
Another example is here: a quantum algorithm for solving linear equations (where you have a matrix and a known vector, and wish to compute the unknown vector). For some classes of linear systems, Harrow, Hassidim and Lloyd have demonstrated that a classical computer’s runtime will be exponentially greater than a quantum computer’s.
As we understand more about quantum algorithms, we’re learning more about the types of problems that quantum computers could be applied to. And as researcher Scott Aaronson describes in this blog post, it’s also becoming clear that one of the best applications for quantum computers is to help us simulate the quantum world. This post describes the limits of classical simulation, with the conjecture that even quantum computers based on linear optics are too complex to be simulated on a classical computer.
Simulating quantum systems is an application that today consumes a huge amount of the world’s supercomputer processor cycles – along with other big-science headliners like astrophysics, genetics, geoscience, materials science, meteorology, climate modelling and high-energy physics.
Why is so much effort expended on quantum simulation? Because that’s where the world begins. Quantum simulation helps us, for example, better understand what’s going on in the world of chemistry. That, in turn, leads to potential applications in materials science, genetics, and other disciplines.
In this Nature paper published last year, for example, European and Canadian researchers propose using quantum computers to (it sounds recursive) simulate the states available to … a quantum computer.
It’s neither silly nor trivial. As they explain in the abstract, the simulation is needed to understand the equilibrium and static properties of the quantum system. But how many samples does a researcher need to be certain they’ve fully characterized the quantum system – without making a “list of everything”?
Today, the problem is approached by sampling, using the Metropolis algorithm on a classical computer. The European/Canadian group propose an alteration to that algorithm that uses a quantum computer to obtain the samples.
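For readers unfamiliar with it, the classical Metropolis algorithm is simple to sketch (the two-level system and parameters below are our own toy example, not the quantum variant the paper proposes): propose a random move, always accept it if it lowers the energy, and otherwise accept it with a probability that shrinks exponentially with the energy increase. The result is a stream of samples distributed according to the Boltzmann weights exp(-βE).

```python
import math
import random

def metropolis_sample(energy, states, beta=1.0, steps=10000, seed=0):
    """Minimal classical Metropolis sampler: visits each state with
    frequency proportional to exp(-beta * energy(state))."""
    rng = random.Random(seed)
    current = rng.choice(states)
    counts = {s: 0 for s in states}
    for _ in range(steps):
        proposal = rng.choice(states)
        delta = energy(proposal) - energy(current)
        # Accept downhill moves always; uphill moves with prob e^(-beta*delta).
        if delta <= 0 or rng.random() < math.exp(-beta * delta):
            current = proposal
        counts[current] += 1
    return counts

# Two-level system: the low-energy state should dominate the samples.
counts = metropolis_sample(lambda s: {"ground": 0.0, "excited": 2.0}[s],
                           ["ground", "excited"])
print(counts)
```

The quantum-assisted proposal replaces the sample-generation step with a quantum computer, which is where the speed-up is claimed.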
If we can build them, quantum computers will help get the quantum universe to yield up more of its secrets – and the best way to prove we can build one is to build one! ®
The Register would like to acknowledge the generous assistance and collaboration of Associate Professor Michael Bremner of the Centre for Quantum Computation and Intelligent Systems at the University of Technology, Sydney, in preparing this article. All errors or omissions, however, are the author's.