Quantum computing is an interesting topic. It is interesting because it can lead to key developments in many fields, and because there is a great deal of misunderstanding surrounding quantum computers. My aim is not to tell you how they work. There are people doing that much better than I could, but I think a more general background is helpful before diving into the details of how quantum computers work.
In this post, we will first look at the historical problems that made quantum computers necessary, then the developments that turned people’s attention toward quantum computing, and finally where it can possibly go. For the more technical parts, I will either link to resources or provide basic introductions in the form of footnotes.1 You can directly click on them to go back and forth.
This is the first post in a two-post series on quantum computing. A link to the next post will magically appear at the end of this sentence when it is published.
Let’s start with the early days.
Benioff, Feynman, and Deutsch
Like any other scientific endeavor, it is hard to single out an individual as the “inventor of the quantum computer,” yet we can look back in time to those people who pioneered this unlikely marriage of quantum mechanics and computer science. Three physicists stand out in the early days, when quantum computing was no more than an idea: Paul Benioff, Richard Feynman, and David Deutsch.
In 1980, at Argonne National Laboratory, Paul Benioff was the first to propose a quantum mechanical model of computing, i.e. a Turing machine that operated on quantum mechanical principles.
His goal was to show that a quantum mechanical model of computation was not impossible, despite certain problems. For simplicity, he focused on a Turing machine that ran on a finite tape. It also had to maintain a history tape to provide reversibility.
He was successful (given that we are talking about him 40 years later), but he had doubts about the possible use cases of such a machine:
…as far as the uses made so far are concerned, it probably makes no difference whether a computer is described classically or quantum mechanically.
Yet right in the next paragraph, he alludes to a key possible use case for quantum computers that brings us to our next physicist:
Consider, for example, the fact that one tests the validity of quantum mechanics by carrying out repetitions of an experiment and comparing the [result] with a theoretical expectation calculated on a computer.
No physics discussion is complete without Richard Feynman. His goal in this case was to simulate physics with computers, but he was not satisfied with an approximate simulation. He wanted an exact simulation that did not require gazillions of years to perform. He stated this problem in his 1981 speech appropriately titled “Simulating Physics with Computers.”
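The difficulty Feynman had in mind can be made concrete: describing a general state of n qubits classically requires 2^n complex amplitudes, so the memory (and time) needed for an exact simulation doubles with every particle added. A minimal back-of-the-envelope sketch (my own illustration, not Feynman's; the function name is made up):

```python
def state_vector_bytes(n_qubits):
    # A general n-qubit state has 2**n complex amplitudes;
    # at 16 bytes per complex number (two 64-bit floats):
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    print(f"{n} qubits -> {state_vector_bytes(n):,} bytes")
```

Ten qubits fit in a few kilobytes, thirty already need around 17 GB, and fifty would take roughly 18 petabytes, which is why exact classical simulation runs out of steam so quickly.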
Feynman also coined the term quantum computer. Before even considering the possibility of such a simulation on classical computers, he asked whether it could be done with quantum computers.
I therefore believe it’s true that with a suitable class of quantum machines you could imitate any quantum system, including the physical world.
He left it as an open problem to define a universal quantum computer, similar to a universal Turing machine, that can simulate any quantum system:
It has been found that there is a kind of universal computer that can do anything, and it doesn’t make much difference specifically how it’s designed. […] What, in other words, is the universal quantum simulator?
So the field was open. The idea of a quantum computer that could be used to simulate any physical system was there. It would take an eccentric physicist and three years to take it from a possibility to a precisely defined reality.
In his 1985 paper “Quantum theory, the Church-Turing principle and the universal quantum computer”, widely regarded as the founding paper of quantum computing, David Deutsch showed that a universal quantum computer was in fact possible.
Deutsch, a physicist himself, was inspired by a conversation he had with Charles Bennett of IBM on the Church-Turing thesis, which states that any effectively computable function can be computed by a universal Turing machine. Deutsch defined a more general, physical version of the thesis, dubbed the Church-Turing-Deutsch principle:
Every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means.
Classical physics and ordinary Turing machines did not obey this principle, so he proposed the “universal quantum computer”. A quantum mechanical machine would be able to simulate any other quantum mechanical system, and hence the entire physical world. He was more interested in such a machine’s implications for quantum mechanics than in actually building one, though.
If such a machine was possible, then where did the extra computing power come from? Deutsch believed this quantum parallelism “placed an intolerable strain” on all interpretations of quantum mechanics except the many-worlds interpretation of Hugh Everett, also known as the parallel-universes interpretation.2 Stretching the jargon, and making quantum physicists reading this die a little bit inside, we can say that when a quantum computer performs a computation, the result is computed in different universes at the same time. Let that sink in.
So the year is still 1985, and we have a model for building quantum computers to simulate quantum mechanics. Is this all? Why all the hype surrounding quantum computers today if all we could do was simulations? Turns out there is more than that.
15 = ? x ?
Simulating quantum mechanics is no simple task, but quantum computers still needed a “killer app” that would bring them more attention, and therefore funding. That application came in 1994, in the form of an algorithm discovered by Peter Shor of Bell Labs: an algorithm, aptly named Shor’s algorithm, for finding the prime factors of integers.
There are two reasons why an integer factorization algorithm was such a big deal:
- No efficient classical algorithm is known for factoring large integers; for practical purposes, it is impossible on classical computers. The revelation that quantum computers could do something besides simulation that classical computers couldn’t meant there could be many more applications.
- One of the most widely used cryptosystems, RSA, derives its security from the hardness of integer factorization. It would be easy to break RSA if one could factor large integers in a reasonable time. This is also likely the reason why the NSA is trying to build a quantum computer.
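To give a flavor of how Shor’s algorithm factors a number, here is a classical sketch of its number-theoretic core (the function names and the toy input are my own). Shor’s insight was that factoring N reduces to finding the order r of some a modulo N; the quantum part of the algorithm finds r efficiently, whereas below it is brute-forced, which takes exponential time in the size of N:

```python
from math import gcd

def order(a, n):
    # Smallest r > 0 with a**r ≡ 1 (mod n).
    # Brute force here; this is the step a quantum computer speeds up.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    # Classical pre/post-processing of Shor's algorithm for a given a:
    # if r is even and a**(r/2) is not ≡ -1 (mod n),
    # then gcd(a**(r/2) ± 1, n) yields nontrivial factors of n.
    r = order(a, n)
    half = pow(a, r // 2, n)
    if r % 2 == 1 or half == n - 1:
        return None  # unlucky choice of a; pick another
    return gcd(half - 1, n), gcd(half + 1, n)

print(shor_classical(15, 7))  # → (3, 5)
```

For N = 15 and a = 7 the order is 4, and the gcd step yields the factors 3 and 5; only the order-finding step needs a quantum computer.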
It is worth mentioning the “quantum computers will break all encryption and the world will collapse” narrative, which seems widespread among the general public. It is wrong in a few respects:
- The quantum computers we have right now are simply not powerful enough to factor numbers as large as those used in RSA. The largest number factored with Shor’s algorithm so far remains 21. Even a modest 1024-bit RSA modulus is much, much larger than that.
- There is a whole branch of cryptography called post-quantum cryptography, in which people are trying to find algorithms (mostly public-key algorithms, since symmetric algorithms and hash functions appear to be secure against quantum computers) that stand strong against quantum attacks. Considering the above point, they still have a lot of time.
The one real problem is migrating all deployed systems from RSA to new cryptosystems if RSA is still widely used when quantum computers become powerful enough. Even so, it is too early to declare a crisis because quantum computers will someday “break” much of modern cryptography.
It is not right to think of quantum computers as pure code-breaking machines, though. The problem that prompted Feynman to propose the idea of quantum computers was simulating quantum mechanics. Even though it may seem like a niche application compared to integer factorization, accurate quantum simulation can pave the way for breakthroughs in many areas. Nanotechnology, materials science, and medicine are just a few examples that come to mind.
So is it just a faster computer?
A consequence of Shor’s discovery was that people, and the media, started to think quantum computers could magically solve all the problems classical computers couldn’t. SAT, traveling salesman, the meaning of life…anything! That is simply not the case. Let’s see what complexity theory has to say about that.3
The class BQP (bounded-error quantum polynomial time) covers problems that a quantum computer can solve in polynomial time with at most a 1/3 chance of error. We know that every problem in P is also in BQP. That is to be expected, since quantum computing is a generalization of classical computing, and a quantum computer can be programmed to behave like a classical one. It is BQP’s relationship to NP that interests us.
So far, it is widely believed (though not proven) that NP is not a subset of BQP, so quantum computers won’t magically solve all problems that require exponential resources on classical computers. In particular, it is deemed unlikely that BQP contains the NP-complete problems. So BQP contains some problems from NP (factoring among them), and possibly some from outside it, but the exact boundary between BQP and NP remains an open problem.
An important reason for the misunderstanding around quantum computers is that, up to this point, pretty much all improvements in computers were of the same nature. Computers kept getting faster and faster, but they were essentially the same: processors had more transistors, screens had more pixels. You could do the same things, only faster. Quantum computers don’t fit this model. They are not just faster computers; they can do things classical computers, no matter how fast, can’t.
As a final note, think of the connection between classical mechanics and quantum mechanics. Physicists know that classical mechanics is only an approximation and quantum mechanics the more fundamental theory, yet classical mechanics is perfectly fine for modeling the things we interact with in our daily lives. Likewise, quantum computers are generalized versions of classical computers, but that won’t mean classical computers will become obsolete in a few years. They will remain more than enough for our practical purposes.
Further Reading & Resources
If you want to learn more about quantum computers and how they work, this essay by Michael Nielsen and Andy Matuschak is one of the best.
The same Michael Nielsen also has a YouTube series on quantum computers.
Benioff, P. (1980). The computer as a physical system: A microscopic quantum mechanical Hamiltonian model of computers as represented by Turing machines. Journal of Statistical Physics, 22(5), 563–591. https://doi.org/10.1007/BF01011339
Feynman, R. P. (1982). Simulating physics with computers. International Journal of Theoretical Physics, 21(6–7), 467–488. https://doi.org/10.1007/BF02650179
Deutsch, D. (1985). Quantum theory, the Church-Turing principle and the universal quantum computer. Proceedings of the Royal Society of London A: Mathematical, Physical and Engineering Sciences, 400(1818), 97–117. https://doi.org/10.1098/rspa.1985.0070
Shor, P. W. (1994). Algorithms for quantum computation: Discrete logarithms and factoring. Proceedings 35th Annual Symposium on Foundations of Computer Science, 124–134. https://doi.org/10.1109/SFCS.1994.365700
Aaronson, S. (2010). BQP and the polynomial hierarchy. Proceedings of the Annual ACM Symposium on Theory of Computing, 141–150. https://doi.org/10.1145/1806689.1806711
Image credit: https://www.smbc-comics.com/comic/the-talk-3
Like this one. ↩
The many-worlds interpretation is an interpretation of quantum mechanics usually contrasted with the Copenhagen interpretation. For the purposes of this article, it is enough to note that both fit observations. See this Wikipedia article for a coverage of different interpretations of quantum mechanics. ↩
Computer scientists have grouped problems into different classes depending on their time and space requirements (we are mostly concerned with time here). Here are some of the key ones (if you are unfamiliar with the expressions “polynomial” and “exponential”, take them as “reasonable time” and “way too long” respectively):
P: Problems that can be solved in polynomial time.
NP: Problems whose solutions can be verified in polynomial time. For many of them no polynomial-time solving algorithm is known, and the best known algorithms take exponential time.
NP-complete: Any problem in NP can be reduced to an NP-complete problem. This means that if one NP-complete problem is solved in polynomial time, then every NP problem can be solved in polynomial time, hence P = NP. No such solution has been found; in fact, it is generally believed (though not proven) that P ≠ NP.
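The verify-versus-solve gap that defines NP can be illustrated with subset sum, a classic NP-complete problem: given a list of numbers, is there a subset adding up to a target? (The function names and numbers below are just illustrative.)

```python
from itertools import combinations

def verify(candidate, target):
    # Verification is cheap: one pass over the proposed subset.
    return sum(candidate) == target

def solve(nums, target):
    # Brute-force search examines up to 2**len(nums) subsets.
    for k in range(len(nums) + 1):
        for cand in combinations(nums, k):
            if verify(cand, target):
                return cand
    return None

subset = solve([3, 9, 8, 4, 5, 7], 15)
print(subset)  # a subset summing to 15
```

Checking a proposed answer stays linear in its size, but each number added to the list doubles the number of subsets the brute-force search may have to try.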