
Staying ahead of Moore's law: ORNL researchers delve into quantum computing’s quandaries

The monitor shows a highly magnified image of a crystal tilted to align with the direction of the microscope's electron beam. The black-and-white spots on the screen are columns of atoms. Image credit: Jason Richards, ORNL

For all the power and complexity of today’s computers, they can still be boiled down to the binary basics—using a code of 1’s and 0’s to calculate and store information. Since the 1980s, though, some computer scientists have strayed from this simple language. They suggest that computers could speak a different dialect, one that taps into the world of quantum mechanics.

Quantum mechanics describes how very small particles such as atoms, nuclei and electrons behave and interact; it entails sharp departures from the phenomena found in classical physics. Take the physics of an electron’s spin, where it’s possible for the particle to be spin-up, spin-down or simultaneously both. This mind-bending property of superposition underlies the concept of quantum computing. 
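
In the standard notation of quantum mechanics, that superposition is written as a weighted sum of the two definite spin states, with the weights fixing the odds of each measurement outcome:

```latex
% A spin-1/2 particle in a superposition of spin-up and spin-down
\[
  \lvert \psi \rangle = \alpha \,\lvert \uparrow \rangle + \beta \,\lvert \downarrow \rangle,
  \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
\]
% |alpha|^2 and |beta|^2 are the probabilities of measuring
% spin-up and spin-down, respectively.
```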

“You can use that one piece of quantum mechanics, the superposition of states, to create a computer that can hold more information than a classical computer ever could,” said David Dean, director of ORNL’s Physics Division. 

The freaky physics of quantum mechanics morphs the binary “bit” of classical computing—either a 0 or 1—into a “qubit,” a bit that can take the form of a 0, 1, or both. Harnessing this power would enable computers to store much more information in a smaller area and run through certain kinds of calculations exponentially faster. 
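
To make that contrast concrete, here is a minimal NumPy sketch (purely illustrative, not drawn from any ORNL code) of the bookkeeping involved: describing a register of n qubits takes 2^n complex amplitudes, while n classical bits hold just one of those 2^n values at a time.

```python
import numpy as np

n = 3  # size of the register

# A classical 3-bit register stores exactly one of the 2**3 = 8 values.
classical_state = 0b101

# A 3-qubit register is described by 2**3 = 8 complex amplitudes.
# Start in |000>, then apply a Hadamard gate to the first qubit to
# put it into an equal superposition.
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # amplitude of |000>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
I = np.eye(2)
H_on_first = np.kron(H, np.kron(I, I))  # H on qubit 0, identity on the rest

state = H_on_first @ state  # equal superposition of |000> and |100>

# On measurement, each basis state appears with probability |amplitude|**2.
for basis, amp in enumerate(state):
    p = abs(amp) ** 2
    if p > 0:
        print(f"|{basis:0{n}b}>: probability {p:.2f}")
```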

Initially considered a theoretical exercise, quantum computing has progressed to proof-of-principle demonstrations of single- and multi-qubit devices, implemented physically through photons, electrons, quantum dots and other approaches. Yet the same intriguing properties that give quantum systems their power also make them fragile. An errant atom, a stray magnetic field or other disturbances can disrupt the balance of a quantum computing system; in fact, today’s state-of-the-art qubits last a matter of microseconds to milliseconds, depending on the physical type of qubit, before breaking down.
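
Those lifetimes matter because coherence decays roughly exponentially with a characteristic time, often called T2. A short illustrative calculation (the 100-microsecond figure below is an assumption for the example, not a measured ORNL value) shows how quickly the window for useful computation closes:

```python
import numpy as np

T2 = 100e-6  # assumed coherence time of 100 microseconds, for illustration

# Fraction of coherence remaining after time t, in a simple exp(-t/T2) model.
for t in [1e-6, 10e-6, 100e-6, 1e-3]:
    print(f"after {t * 1e6:7.1f} us: coherence ~ {np.exp(-t / T2):.3f}")
```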

“Those kinds of material issues make it so that the qubit cannot live long enough to actually do any useful computation,” said Dean, who oversees a new lab-funded initiative aimed at addressing challenges in quantum computing. 

In one project, ORNL researchers seek to overcome the trouble caused by wayward atoms by using advanced microscopy techniques to detect and control single atoms in a semiconductor. Another effort focuses on understanding the refrigeration challenge: To avoid atomic vibrations that disrupt qubit operation, quantum experiments are usually cooled to near absolute zero. The refrigeration systems can introduce interference, however, so ORNL researchers are examining ways to control and balance the effects of temperature on qubits. 

Other research projects are directed at improving characterization, modeling and communications systems to enable practical applications of quantum computing systems. 

“The joke is that you need a quantum computer to program a quantum computer,” said ORNL researcher Travis Humble, who leads a project to develop a virtual testbed for silicon-based qubits. 

Quantum technologies aren’t expected to replace traditional computing architectures but rather to provide a quantum boost. Quantum accelerators, akin to graphics processing units, could be used to more efficiently solve certain types of problems—simulations of quantum mechanical systems such as chemical systems or quantum gravity, for example, or encryption and decryption calculations.

“I don’t see this as a one-or-the-other situation,” Humble said. “It’s much more likely that we will use quantum computing the way we use other methodologies, by integrating them into a larger computer.”
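
A minimal Python sketch of that accelerator pattern (the function names below are hypothetical, not a real ORNL or vendor API): a classical outer loop sweeps a parameter and offloads each evaluation to a stand-in for the quantum device, the same offload-and-optimize shape used in variational quantum algorithms.

```python
import numpy as np

def quantum_subroutine(theta: float) -> float:
    """Stand-in for a call to a quantum accelerator: returns the
    expectation value <Z> of a single qubit rotated by angle theta.
    For this toy circuit the analytic answer is cos(theta)."""
    return float(np.cos(theta))

def classical_driver() -> None:
    """Classical host code: sweep the parameter, offload each point to
    the 'accelerator', and keep the minimum, just as a host program
    today farms work out to a GPU."""
    best_theta, best_value = min(
        ((theta, quantum_subroutine(theta)) for theta in np.linspace(0, np.pi, 50)),
        key=lambda pair: pair[1],
    )
    print(f"best theta = {best_theta:.3f}, <Z> = {best_value:.3f}")

classical_driver()
```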

Scientists like Humble believe quantum research is key to meeting the demand for ever faster, smaller and more powerful computing devices, given that transistors on traditional computer chips are fast approaching the limits of miniaturization.

“We’ve always been able to beat Moore’s Law (which says the number of transistors on a computer chip doubles every two years), but we’re getting to the point where our current methods are likely to fail in the future,” Humble said. “As we decrease the feature size, we’re reaching the point where the physics models used to describe the behaviors of the features begin to break down. This is where the quantum aspect begins to become very real. 

“The question,” Humble said, “is what do we do in the face of that challenge? We need to embrace the quantum effects and make something out of them.”