Why quantum computers have a hype problem

As a buzzword, "quantum computing" probably ranks just behind AI in terms of hype. Large technology companies such as Alphabet, Amazon, and Microsoft now devote extensive research and development to quantum computers, and a number of startups have sprung up as well, some boasting staggering valuations. IonQ, for example, was valued at just over $2 billion when it went public in October 2021 through a merger with a special-purpose acquisition company (SPAC). Much of this activity has developed at astonishing speed over the past three years.
I am as pro-quantum-computing as one can be: I have published more than 100 technical papers on the subject, and many of my PhD students and postdocs are now world-renowned quantum computing experts. But I am still concerned about the hype surrounding quantum computing these days, especially when it comes to claims about how it will be commercialized.
There are already well-established proposed applications for quantum computers. The best known is Peter Shor's theoretical proof from 1994 that a quantum computer can solve the hard problem of finding the prime factors of large numbers exponentially faster than any known classical method. Prime factorization is at the heart of the widely used RSA cryptosystem, so Shor's factoring scheme immediately caught the attention of governments around the world and led to significant funding for quantum computing research.
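To make the classical side of this asymmetry concrete, here is a minimal sketch, an illustration only and not Shor's algorithm: brute-force trial division needs on the order of the square root of N steps to factor N, roughly 10 to the power n/2 for an n-digit number, which is why sufficiently large RSA moduli are out of classical reach, whereas Shor's algorithm on a fault-tolerant quantum computer would find factors in time polynomial in n.

```python
def trial_division(n: int) -> int:
    """Return the smallest prime factor of n (n > 1) by brute force.

    Worst case takes about sqrt(n) divisions, i.e. exponential in the
    number of digits of n; this is the scaling Shor's algorithm beats.
    """
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime

# A toy RSA-style modulus: the product of two small primes.
p, q = 1009, 2003
print(trial_division(p * q))  # recovers the smaller factor, 1009
```

For toy numbers this finishes instantly; for a 2048-bit RSA modulus the same loop would run for longer than the age of the universe, which is the entire point of the scheme.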
We can build it – but what does the quantum computer do?
The problem is building a quantum computer that can actually run Shor's algorithm. That depends on implementing an idea, developed by Shor and others, called quantum error correction: a process that compensates for the fact that quantum states are quickly destroyed by environmental noise (a phenomenon called "decoherence"). In 1994, scientists thought such error correction would be easy because physics allows it. In practice, it has proved extremely difficult.
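The core idea of error correction can be sketched with a classical toy analogue, a deliberate simplification with assumed numbers: encode one logical bit as three physical copies, let each copy flip with some probability, and decode by majority vote. Real quantum codes must in addition correct phase errors without directly measuring the fragile state, which is what makes the quantum case so much harder.

```python
import random

def noisy_copy(bit: int, p: float, rng: random.Random) -> int:
    """Flip the bit with probability p, simulating a noisy physical bit."""
    return bit ^ (rng.random() < p)

def logical_error_rate(p: float, trials: int = 100_000, seed: int = 0) -> float:
    """Estimate how often a 3-copy majority vote decodes a 0 incorrectly."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        copies = [noisy_copy(0, p, rng) for _ in range(3)]
        if sum(copies) >= 2:  # majority says 1: a logical error
            errors += 1
    return errors / trials

# With a 1% physical error rate, the logical error rate drops to roughly
# 3 * p**2, i.e. around 3e-4: redundancy buys reliability.
print(logical_error_rate(0.01))
```

The trade is the same one described above for qubits: reliability is purchased with redundancy, and the quantum version demands vastly more redundant hardware per protected unit than this three-for-one toy.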
The most advanced quantum computers today have dozens of decohering (or "noisy") physical qubits. Building a quantum computer that could crack RSA encryption out of such components would require many millions, if not billions, of qubits. Only a few tens of thousands of these would be used for computation, as so-called logical qubits; the rest would be needed for error correction, compensating for decoherence.
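The scale of that gap can be put in back-of-the-envelope numbers. The figures below are illustrative assumptions consistent with the rough counts above, not the requirements of any specific error-correcting code:

```python
# "A few tens of thousands" of logical qubits for the computation itself.
logical_qubits_needed = 20_000
# Assumed error-correction overhead: physical qubits per logical qubit.
physical_per_logical = 1_000

physical_qubits_needed = logical_qubits_needed * physical_per_logical
print(f"{physical_qubits_needed:,} physical qubits")  # 20,000,000

# Rough scale of today's noisy devices: dozens to ~100 physical qubits.
noisy_qubits_today = 100
print(f"gap: {physical_qubits_needed // noisy_qubits_today:,}x")  # 200,000x
```

Even under these charitable assumptions, current hardware sits five orders of magnitude short of an RSA-breaking machine, which is the distance the vacuum-tube analogy below is meant to convey.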
The qubit systems we have today are a tremendous scientific achievement, but they take us no closer to a quantum computer that can solve problems anyone cares about. It is like trying to build today's best smartphones using vacuum tubes from the early 20th century. You can put 100 tubes together and claim that if you could somehow get 10 billion of them to work together coherently and seamlessly, you could achieve all sorts of miracles. What is missing, however, is the breakthrough of integrated circuits and CPUs, which ultimately led to smartphones; it took 60 years of very hard engineering to get from the invention of the transistor to the smartphone, with no new physics required along the way.

Smart physicists

There are indeed ideas, and I have played some role in developing the theory behind them, for circumventing quantum error correction by using far more stable qubits, in an approach known as topological quantum computing. Microsoft is pursuing this idea. However, it turns out that developing topological quantum computing hardware is also a major challenge. It is unclear whether full-scale quantum error correction or topological quantum computing (or something else, such as a hybrid of the two) will ultimately win out.
Physicists, as we all know, are smart (I am a physicist), and some physicists are also very good at coining catchy acronyms that stick in the mind. The great difficulty of eliminating decoherence has given us the memorable acronym NISQ, for "noisy intermediate-scale quantum" computing: the idea that small collections of noisy physical qubits could do something useful, and do it better than a classical computer can. I am not sure what that means: How noisy? How many qubits? Why is it a computer? What problems of value can such a NISQ machine solve?
In a recent laboratory experiment at Google, some predicted aspects of quantum dynamics (so-called "time crystals") were observed using 20 noisy superconducting qubits. The experiment was an impressive demonstration of electronic control techniques, but it showed no computing advantage over conventional computers, which can easily simulate time crystals with a similar number of virtual qubits. Nor did it reveal anything about the fundamental physics of time crystals. Another recent NISQ success is the sampling of random quantum circuits, likewise a highly specialized task with no commercial value.

Novel basic research

Using NISQ devices is certainly an excellent new idea for basic research, and it could advance physics in fundamental areas such as quantum dynamics. But despite the constant hype around NISQ emanating from various quantum computing startups, its potential for commercialization is far from clear. I have read vague claims that NISQ could be used for fast optimization or even for AI training. I am no expert in optimization or artificial intelligence, but when I ask the experts, they are equally clueless. When I ask researchers involved in various startups how NISQ could optimize some hard task with real-world applications, I interpret their convoluted answers as saying, in essence, that since we do not fully understand how classical machine learning and AI really work, it is possible that NISQ could do these things even faster. Maybe, but that is a hope, not an actionable technology.
There are proposals to use small quantum computers in drug development to quickly compute molecular structure, a worthwhile application, except that quantum chemistry is only a tiny part of the overall process. Equally confusing are claims that quantum computers will help in finance in the near future. No technical publication convincingly demonstrates that small quantum computers, let alone NISQ machines, will lead to significant optimization in algorithmic trading, risk assessment, arbitrage, hedging, forecasting, asset trading, or the creation of risk profiles. That has not stopped several investment banks from jumping on the quantum computing bandwagon.
A true quantum computer will have applications unimaginable today, just as nobody in 1947, when the first transistor was made, could have foreseen that it would eventually lead to smartphones and laptops. I am very hopeful and a firm believer in quantum computing as a potentially transformative technology, but the claim that it will soon bring millions of dollars in profit to real companies selling services or products baffles me.

Like a jumbo jet

Quantum computing is indeed one of the most important developments not only in physics but in all of science. But "entanglement" and "superposition" aren't magic wands that we expect to transform technology in the near future. Quantum mechanics is indeed strange and counterintuitive, but that alone is no guarantee of a startup's sales and profits.
More than a decade ago, I was often asked when I thought a real quantum computer would be built. (Interestingly, I no longer get asked that question today; the hype surrounding quantum computers seems to have convinced people that these systems already exist or are just about to be completed.) My unequivocal answer has always been: I don't know. It is impossible to predict the future of technology; it will happen when it happens. One can, however, try to draw an analogy with the past.
It took the aviation industry more than 60 years to progress from the Wright brothers to jumbo jets carrying hundreds of passengers thousands of miles. The immediate question is where the development of quantum computing in its present form fits on this timeline. With the Wright brothers in 1903? With the first jet aircraft around 1940? Or are we perhaps still far back in the early 16th century, with Leonardo da Vinci's flying machine? I don't know. And nobody else does either.
Sankar Das Sarma is Director of the Condensed Matter Theory Center at the University of Maryland, College Park.
