Quantum computing promises to be a key development of the next decade. It’s not magic, nor is it earth-shattering, despite all the spilled ink and breathless documentaries on the subject.
Fifty years ago, Gordon Moore observed that the density of integrated circuits doubles every two years. Since then, improvements in chip speed and density have allowed the cost of computing to drop by half every eighteen months. We’re approaching the limits of current chip-making technology, and Moore’s Law is only an observation, not a law in any definitional sense; there’s no guarantee the trend continues. Nonetheless, quantum computing offers us hope that it will.
The computers and phones we use today are based on binary logic. We call these devices classical computers. A voltage on a wire or in a cell of a memory chip is a one; no voltage is a zero. We call a single data element a “bit,” short for “binary digit.” At any time, you can attach a meter or a bulb to see whether any particular bit is on or off, one or zero. You may have to watch quickly – the value may only be there for a billionth of a second – but it is always one or the other. You’ll need a small probe, too: today’s memory chips are already quite small, with a single bit occupying about 50 nm x 50 nm (50 billionths of a meter on a side).

If one can shrink the wire or memory cell enough, one can take advantage of a phenomenon in quantum mechanics that allows a bit to take on multiple values at the same time. I’m not going to explain it here, but you can get the idea if you Google Schrödinger’s Cat. A quantum computer uses a different kind of bit, the qubit, short for quantum bit. While the machine is computing, a qubit exists in a superposition of zero and one, described by a pair of complex numbers. A qubit can be stored on a single atom or a single photon. One method of building a quantum computing chip is to place atoms of niobium in a lattice of silicon. A niobium atom is 285 pm in diameter (285 trillionths of a meter), so a quantum computer offers the possibility of shrinking a memory cell by a factor of nearly 200 in each dimension. Smaller size means less energy and faster computation. That’s a hell of an improvement, but it’s not earth-shattering.
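To make the “pair of complex numbers” idea concrete, here is a minimal sketch of a qubit state as two complex amplitudes, one for zero and one for one. The variable names are mine, not from any quantum library; the only rule is that the squared magnitudes of the amplitudes sum to one and give the measurement probabilities.

```python
import math

# A qubit's state is a pair of complex amplitudes (alpha, beta) for the
# values 0 and 1, constrained so that |alpha|^2 + |beta|^2 = 1.
# On measurement the qubit collapses: 0 with probability |alpha|^2,
# 1 with probability |beta|^2.

alpha = complex(1 / math.sqrt(2), 0)  # amplitude for 0
beta = complex(0, 1 / math.sqrt(2))   # amplitude for 1 (note the complex phase)

p_zero = abs(alpha) ** 2
p_one = abs(beta) ** 2

print(round(p_zero, 3), round(p_one, 3))  # → 0.5 0.5, an equal superposition
```

Notice that the two states above differ by a complex phase yet give identical measurement probabilities; that phase is the extra information a qubit carries that a classical bit cannot.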
To do computing on a single atom or photon takes a quiet environment. So does a computer or phone. Your desktop PC has a metal cabinet called a Faraday cage to prevent stray signals from entering or leaving the box. Your cell phone has a layer of very expensive conductive paint inside to do the same. The single atoms or photons in a quantum computer require protection not only from the outside, but from signals inside the box. That requires that the chip sit in a cold environment, so that heat energy doesn’t disturb the qubit. D-Wave, a manufacturer of quantum computers, cools the chip to 15 mK (15 millikelvins, about 1/180 the temperature of outer space).
Cooling a computer is nothing new. Big IBM mainframes and Cray supercomputers both used massive cooling systems. What is different in quantum computing is the low temperature that promotes superconductivity.
After you’ve built a cell small enough and cold enough to allow a qubit to take on all the values between zero and one simultaneously – a phenomenon called superposition – all kinds of computations become possible. One is factoring large numbers, which would allow an attacker to crack RSA-encrypted data. That would make most of today’s encryption schemes obsolete, including the methods we use to secure credit card data.
As long as Schrödinger’s quantum cat remains locked in its box, it is at once alive and dead. When you open the box and look, you’ll see only one outcome. We say that the quantum state collapses. When it does, it reveals one and only one classical state. In the case of the cat, it is dead or alive. In the case of a quantum computer, the qubit collapses to a one or a zero, just like a bit in any other computer today. There is a certain amount of randomness in that collapse, so programming for a quantum computer requires that the algorithm be repeated many times. The result that comes up most frequently is the right answer. The efficiency of qubit computing far outweighs the cost of repeating the computation – or at least, that’s the hope.
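The repeat-and-vote idea can be sketched classically. The simulation below is purely illustrative (the function name and the 70% success rate are my assumptions, not from any real machine): each “run” collapses to the correct answer only probabilistically, but a majority vote over many runs recovers it with near certainty.

```python
import random
from collections import Counter

# One simulated run of a probabilistic computation: it collapses to the
# correct bit with probability p_correct, otherwise to the wrong one.
rng = random.Random(42)  # fixed seed so the sketch is reproducible

def noisy_run(correct_bit=1, p_correct=0.7):
    return correct_bit if rng.random() < p_correct else 1 - correct_bit

# Repeat the computation many times and take the most frequent result.
counts = Counter(noisy_run() for _ in range(1001))
answer = counts.most_common(1)[0][0]
print(answer)  # → 1: the correct bit wins the vote
```

With 1001 runs at a 70% per-run success rate, the chance of the vote being wrong is vanishingly small, which is why repeating a fast quantum computation can still be a bargain.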
The qubit is not as radical as it sounds. Before the advent of the classical digital computer came the analog computer. A single wire on an analog computer could carry a continuous range of current and voltage values. The disadvantage of the analog computer was not the accuracy of its results, but the need to rewire it for every new problem. Today we load a program onto a digital computer to solve a new problem; we don’t have to rewire it.
Magic or just sleight of hand?
What is magical about the quantum computer is entanglement – what Einstein called “spooky action at a distance.” Tie two photons or two electrons together, then separate them, and they will act like mirror-image identical twins no matter how long or how far apart they are. Measure the spin of one electron and then measure the spin of its mate: you always get the opposite answer. Experiments have shown the effect to work at a speed of at least 10,000 times the speed of light. You cannot use entanglement to communicate new information faster than light, because the result of the measurement of the first particle is random; only with certainty will the measurement of the second produce the opposite result. Communicating new information would require that you predict or force the first measurement to produce a specific result. With entanglement, you get correlated results from two measurements almost instantaneously, but you aren’t transferring new information, which is the definition of communication. Entanglement lets information about the state of one qubit be shared with another almost instantaneously, and having entangled particles in a quantum computer is what gives it its oomph. Entanglement is hard to observe in a computer, the same way the cat in Schrödinger’s box is impossible to observe undisturbed, or the way you can’t know whether the light glows inside a closed refrigerator: the act of observing changes the thing observed. Detractors say D-Wave’s claim of entanglement is magic – in the sense of marketing sleight of hand.
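The anticorrelation described above can be sketched in a few lines. Note the hedge in the comments: this is a classical stand-in that reproduces only the opposite-spins behavior for a single measurement setting; real entanglement produces correlations (Bell violations) that no classical simulation this simple can match across all measurement angles.

```python
import random

# Classical illustration only: a pair of "entangled" spins where the
# first measurement is random (so no message is sent) but the second
# is always the opposite. Real entanglement is stranger than this.

rng = random.Random(0)  # fixed seed for reproducibility

def measure_pair():
    first = rng.choice(["up", "down"])           # random outcome
    second = "down" if first == "up" else "up"   # perfectly anticorrelated
    return first, second

pairs = [measure_pair() for _ in range(1000)]
assert all(a != b for a, b in pairs)  # the mates always disagree
```

Each first measurement is a coin flip, so the receiver learns nothing it couldn’t have generated locally; that is exactly why the perfect correlation doesn’t amount to faster-than-light communication.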
D-Wave claims potential computing speeds 10⁸ (100 million) times faster than a classical computer because of entanglement. Achieving that requires that all of the qubits in the machine be entangled. One cannot observe this entanglement directly, but only by observing the performance of the machine executing a program. There are formulae that predict performance for both quantum and classical computers. Testing by third-party academics in 2015 showed that D-Wave computers programmed to use only two qubits follow the formula predicted for entangled qubits in a quantum computer (Albash, Hen, Spedalieri, & Lidar, 2015). Increase the number of qubits required by the test programs to eight and the results still show evidence of entanglement, but less strongly. When other investigators have looked at D-Wave using twenty or one hundred qubits, the results look like a classical computer. In short, the D-Wave computer is exhibiting quantum effects, but it’s not firing on all quantum cylinders.
D-Wave has been building computers with more and more cylinders (qubits) every year.
- 2007 – Orion 16 qubits
- 2011 – D-Wave One 128 qubits
- 2013 – D-Wave Two 512 qubits
- 2015 – D-Wave 2X 1152 qubits
- 2017 – D-Wave 2000Q 2048 qubits
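Taking the list above at face value, we can check how D-Wave’s qubit count stacks up against Moore’s eighteen-month doubling. A quick back-of-the-envelope calculation (using the 2011 and 2017 machines as endpoints):

```python
import math

# Compound growth model: qubits = start * 2 ** (years / doubling_time).
# Endpoints assumed from the list above: D-Wave One (2011, 128 qubits)
# to D-Wave 2000Q (2017, 2048 qubits).
years = 2017 - 2011
factor = 2048 / 128  # 16x growth in six years

doubling_time = years / math.log2(factor)
print(doubling_time)  # → 1.5: qubit count doubled about every 18 months
```

By this crude measure, D-Wave’s qubit counts have so far tracked the same eighteen-month doubling cadence as classical computing costs, though raw qubit count is of course not the same thing as useful performance.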
IBM, long the leader in mainframe computing, jumped into quantum computing in 2016, announcing a five-qubit machine as part of a $3 billion investment in the field. In May of 2017, it announced two new machines of 16 and 17 qubits respectively. The approach is different from D-Wave’s, but apparently successful – successful enough that IBM has invited the public to try it out over the web.
Moore’s prediction fulfilled
There’s a three-way race to fulfill Moore’s prediction that computer performance will continue to double every eighteen months. Silicon chipmakers will continue to make improvements in classical computers. They’ve been working that mine for fifty years. Intel expects only five more years of improvements. It is dipping its toes into other quantum technologies: tunneling transistors and spintronic memory.
D-Wave has been in the quantum computing game for ten years. Whether it gains enough speed to grab the baton in five years is anyone’s guess. My money is on IBM. With a $3 billion commitment and a century of experience in research, development, and manufacturing, it is uniquely positioned to repeat its wins with the mainframe and personal computer.
Albash, T., Hen, I., Spedalieri, F., & Lidar, D. (2015, December 14). Reexamination of the evidence for entanglement in a quantum annealer. Physical Review A, 92, 062328. doi:10.1103/PhysRevA.92.062328