
Quantum advantage

Small-scale phenomena drive next-generation computing technology

From smartphones to supercomputers, users of digital devices are united in their need for speed. For decades, that has meant creating silicon-based computer chips that pack more circuitry into increasingly smaller spaces. Recently, however, the recognition that the density of circuitry on silicon chips is nearing its theoretical limit has accelerated research in the field of quantum computing.

Although still in its infancy, quantum computing has demonstrated the potential to handle certain types of computational problems many times faster than traditional methods. The key to this advantage is its reliance on esoteric physical interactions that occur among particles like photons and electrons on a vanishingly small scale.

Creating quantum entanglement

ORNL postdoctoral researcher Ben Lawrie operates the controls at the laser table. Photo: Jason Richards

To generate these quantum interactions in the laboratory, ORNL quantum scientist Raphael Pooser fires a laser into a cloud of vaporized rubidium atoms. The laser beam, like all light, is composed of individual photons. Under tightly controlled conditions, the atomic vapor acts like a nonlinear crystal, splitting the laser into twin beams of light, each with a frequency slightly different from that of the original. Pairs of photons—one in each beam—are created by this process. Each member of the pair is linked to the other by a phenomenon known as quantum entanglement. “These entangled pairs are the fundamental building blocks of quantum optics,” Pooser says. “They enable us to do quantum computing with photons.”
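To make the twin-beam relationship concrete, the sketch below works through the energy bookkeeping of the process: two pump photons are converted into a pair of photons whose frequencies sit symmetrically above and below the pump, so the total energy is conserved. The pump wavelength and frequency offset used here are illustrative, round-number assumptions rather than values from Pooser's experiment.

```python
# Illustrative sketch (not from the article): energy conservation in
# four-wave mixing, the process that splits the pump laser into twin beams.
# Two pump photons are converted into one "probe" and one "conjugate" photon:
#     2 * f_pump = f_probe + f_conjugate
# The wavelength and frequency offset below are assumed, round-number values.

C = 299_792_458.0  # speed of light, m/s

pump_wavelength_nm = 795.0                  # near the rubidium D1 line (assumed)
f_pump = C / (pump_wavelength_nm * 1e-9)    # pump frequency in Hz

delta = 3.0e9                               # hypothetical probe/conjugate offset, 3 GHz

f_probe = f_pump + delta                    # one twin beam sits above the pump...
f_conjugate = f_pump - delta                # ...the other sits symmetrically below

# Energy conservation: the pair carries exactly the energy of two pump photons.
assert abs(2 * f_pump - (f_probe + f_conjugate)) < 1.0  # equal up to float rounding

print(f"pump:      {f_pump / 1e12:.4f} THz")
print(f"probe:     {f_probe / 1e12:.4f} THz")
print(f"conjugate: {f_conjugate / 1e12:.4f} THz")
```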

The notion of quantum entanglement is difficult for most people to take in. Even Albert Einstein wasn’t a big fan of the concept. In a paper published in 1935, Einstein and his colleagues disputed the completeness of the new theory of quantum mechanics because they claimed it failed to define all elements of reality accurately. Among their misgivings were reservations about the implications of equations indicating that measuring the position or momentum of one quantum object (such as a photon) allows one to calculate the position or momentum of another—regardless of the distance between the two. Einstein skeptically called this phenomenon “spooky action at a distance.” “We call it ‘entanglement,’” Pooser says.

Encoding quantum information

Pooser explains that the “spooky” relationship between entangled photons’ positions and momenta extends to their polarization as well. In classical computing, the distinction between horizontally and vertically polarized light could be used to express a unit of digital information known as a binary digit, or “bit,” represented by a zero or a one. In quantum computing, however, a quantum bit, or qubit, is much more versatile. With vertical polarization, the value of the qubit is zero; horizontal polarization yields a one. “What is interesting,” Pooser explains, “is that the qubit can have not only horizontal or vertical values, but both, and any intervening values, at the same time. That is the probabilistic nature of quantum mechanics.” This ability to express simultaneous, superposed values illustrates the advantage quantum computing has over classical computational methods. “Superposition means we can do calculations for all possible values of the qubit at once,” Pooser says. “That’s what we call massive parallelization.”
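As a rough illustration of what a superposed qubit looks like on paper, the short sketch below writes a polarization qubit as a two-component state vector, following the article's convention that vertical polarization stands for zero and horizontal for one. The amplitudes and the sampling step are arbitrary choices made only for this example.

```python
# Minimal sketch of a polarization qubit as a two-component state vector.
# Following the article's convention: vertical polarization -> 0, horizontal -> 1.
# The amplitudes chosen below are arbitrary; any pair with |a|^2 + |b|^2 = 1 works.
import numpy as np

ket_V = np.array([1.0, 0.0])   # |0>  (vertical polarization)
ket_H = np.array([0.0, 1.0])   # |1>  (horizontal polarization)

# A superposition: the qubit carries "both" values until it is measured.
theta = np.pi / 8                                # arbitrary mixing angle
psi = np.cos(theta) * ket_V + np.sin(theta) * ket_H

# Measurement is probabilistic: the squared amplitudes give the odds
# of reading out 0 (vertical) or 1 (horizontal).
p0, p1 = np.abs(psi) ** 2
print(f"P(0) = {p0:.3f}, P(1) = {p1:.3f}")        # the two probabilities sum to 1

# Sampling many measurements reproduces those probabilities.
rng = np.random.default_rng(0)
outcomes = rng.choice([0, 1], size=10_000, p=[p0, p1])
print("measured fraction of 1s:", outcomes.mean())
```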

The challenge for Pooser is determining how to extract all of the answers. “If we can’t access all of the results at the same time, what good is this massive parallelization?” he asks. That’s where entanglement comes in.

The difficulty in extracting multiple results arises from the inherent uncertainty of quantum measurements. An oft-cited example of this ambiguity comes from one of Einstein’s contemporaries, the physicist Werner Heisenberg, whose uncertainty principle states that it is impossible to precisely measure both the position and the velocity of a quantum object at the same time. Similarly, researchers cannot read all possible values of a qubit at the same time. However, as noted above, entanglement makes it possible to make a measurement on one entangled object and then to infer the value of the same measurement for its entangled partner. As a result, a pair of entangled photons enables simple calculations to be made. As Pooser and his colleagues succeed in entangling greater numbers of photons, they will use them to address progressively more complex computational problems.
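The sketch below illustrates the kind of correlation Pooser is describing, using the simplest entangled state of two qubits (a Bell state). It classically simulates the measurement statistics rather than modeling the laboratory optics, but it shows the key point: reading out one member of the pair immediately fixes what its partner will read out.

```python
# Sketch: measurement correlations in a maximally entangled (Bell) state,
#     |Phi+> = (|00> + |11>) / sqrt(2)
# Measuring the first photon immediately tells you what the second will read,
# which is the inference the article describes. This is a classical simulation
# of the outcome statistics, not a model of the laboratory apparatus.
import numpy as np

bell = np.zeros(4)
bell[0b00] = 1 / np.sqrt(2)   # amplitude of |00>
bell[0b11] = 1 / np.sqrt(2)   # amplitude of |11>

probs = np.abs(bell) ** 2     # joint outcome probabilities for (photon A, photon B)

rng = np.random.default_rng(1)
samples = rng.choice(4, size=10_000, p=probs)
a = (samples >> 1) & 1        # result read out on photon A
b = samples & 1               # result read out on photon B

# Every shot agrees: knowing A's result fixes B's.
print("fraction of shots where A == B:", np.mean(a == b))   # prints 1.0
```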

Quantum calculations

Quantum calculations are made by the optical equivalent of algorithms—specialized mathematical equations. Pooser uses an apparatus called a laser table to direct beams of entangled photons through a network of optical components, like beam splitters, lenses and filters, to create these optical algorithms. These components enable the photons to interact with one another so that, when they complete the circuit of the table, the results of the equation are encoded in their final optical characteristics.
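One simplified way to picture such an optical circuit is to treat each component as a small unitary matrix acting on a pair of light modes, with the whole table represented by the product of those matrices. The sketch below uses that single-photon, two-mode picture; it is an illustration of the idea, not a model of Pooser's continuous-variable apparatus, and the reflectivity and phase values are arbitrary.

```python
# Sketch: optical components as unitary matrices acting on two light modes.
# A sequence of components on the laser table composes by matrix multiplication;
# the final state's amplitudes encode the "result" of the optical circuit.
# This single-photon, two-mode picture is a simplification for illustration.
import numpy as np

def beam_splitter(reflectivity=0.5):
    """Unitary for a lossless beam splitter with the given power reflectivity."""
    r = np.sqrt(reflectivity)
    t = np.sqrt(1 - reflectivity)
    return np.array([[t, 1j * r],
                     [1j * r, t]])

def phase_shifter(phi):
    """Unitary that adds a phase phi to the second mode."""
    return np.diag([1.0, np.exp(1j * phi)])

# A photon enters in mode 0.
state_in = np.array([1.0 + 0j, 0.0])

# A Mach-Zehnder-style sequence: split, phase shift, recombine.
# (Matrices apply right to left, so the rightmost component acts first.)
circuit = beam_splitter() @ phase_shifter(np.pi / 3) @ beam_splitter()
state_out = circuit @ state_in

# Detection probabilities at the two output ports.
print(np.abs(state_out) ** 2)   # sums to 1 because the circuit is unitary
```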

“Most quantum algorithms use entanglement to get results,” Pooser says. One of the most highly touted applications for this technology is factoring large numbers into primes—numbers that can be evenly divided only by themselves and 1. Large prime numbers are of interest in computing circles because they are used to create, and decode, encrypted data. “It turns out that quantum computing is particularly good at that,” he says.
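A toy example helps explain the connection to encryption: an RSA-style public key hides two large primes inside their product, and anyone who can factor that product recovers the secret. The sketch below uses deliberately tiny "large" primes and brute-force trial division to make the point; the quantum approach the article alludes to (Shor's algorithm) is expected to factor genuinely large numbers far faster than any such classical search.

```python
# Toy sketch (not from the article): why factoring matters for encryption.
# An RSA-style public key hides two large primes p and q inside their product n.
# Classically, recovering p and q from n takes time that blows up with n's size;
# Shor's quantum algorithm is expected to do it efficiently.
# The primes below are tiny toy values chosen only for illustration.

def trial_division_factor(n):
    """Return a nontrivial factor of n by brute force (hopelessly slow for real key sizes)."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n   # n itself is prime

p, q = 1_000_003, 1_000_033   # toy "secret" primes
n = p * q                     # the public modulus

found = trial_division_factor(n)
print(found, n // found)      # recovers p and q, so the secret is out
```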

As computational requirements become more complex, the number of optical components required to reach a solution increases as well. This and other practical considerations are fueling a drive toward miniaturization of these optical circuits. Most of Pooser’s computational research has been conducted at the macro scale—on a laser table.

However, he and his colleague, quantum information scientist Phil Evans, have made considerable progress in scaling down these optical algorithms to the point that a tabletop experiment can be recreated on an optical chip the size of a fingernail. These chips have microscopic waveguides etched into their surfaces to channel laser light as it undergoes optical manipulations that mirror those used on the full-sized laser table.

Quantum simulations

One of the primary applications of complex quantum calculations is expected to be creating quantum simulations. Pooser explains that programmers can write routines for classical computers that simulate quantum systems, but they will never be exactly right because classical computers are designed to operate under the laws of classical physics, not quantum physics. “If we really want to simulate a quantum system accurately,” he says, “we need to build a computer that obeys the laws of quantum mechanics. For example, if I build a quantum mechanical simulation that demonstrates high-temperature superconductivity, the fact that I could build the simulation proves that the system could exist in nature. This is one of the ways a quantum simulator could provide insight into the quantum world that classical computers can’t.”
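A quick back-of-the-envelope sketch shows why exact classical simulation runs out of room: representing n qubits exactly takes 2^n complex amplitudes, so the memory required doubles with every qubit added. The 16-byte figure below assumes standard double-precision complex numbers.

```python
# Sketch: why exact classical simulation of a quantum system hits a wall.
# Representing n qubits exactly requires 2**n complex amplitudes; at 16 bytes
# per amplitude, the memory needed doubles with every qubit added.
# The byte size assumes the usual double-precision complex format.

BYTES_PER_AMPLITUDE = 16   # complex128: two 8-byte floats

for n in (10, 20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * BYTES_PER_AMPLITUDE / 2 ** 30
    print(f"{n:2d} qubits -> {amplitudes:>16,d} amplitudes, about {gib:,.1f} GiB")

# By roughly 50 qubits the state vector alone outgrows even very large machines,
# which is one motivation for building simulators that are themselves quantum.
```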

The potential for eventually producing quantum simulations has been a big motivation for developing quantum computing. Pooser expects that the area will continue to be of interest to the Department of Energy because scientists are finding that quantum mechanical processes are intertwined with critical energy technologies. “For example,” he says, “photosynthesis involves quantum entanglement, and generating solar energy depends on the photoelectric effect, which is also a quantum mechanical process. If we want to conduct the most accurate possible analyses of systems like these, we’ll need a quantum computer.”

Technology of tomorrow

Pooser says he and his colleagues in ORNL’s Cyberspace Science and Information Intelligence Research group are on the cutting edge of quantum optics. “Only one or two other groups in the world have the capability to do this kind of research,” he says. “This is the technology of tomorrow.”

Looking ahead, Pooser estimates that, in five years, complex calculations will be carried out on optical chips—or a series of chips. In 10 years, he hopes to be able to consolidate complex algorithm processing on a single chip. He suggests that the timeline for quantum computing’s becoming a viable analytical platform could be anywhere from 15 to 20 years down the road if there is a big breakthrough, longer if there’s not. “The world is waiting for this,” he says. “There’s no question that it will happen. It’s just a question of when.”— Jim Pearce
