# Limits of the Efficiently Computable: Rise of the Quantum Computers

“I don’t believe in Computer Science.” ~ Richard Feynman at Bell Labs (1985).

Feynman argued that science is the study of the behavior of nature, and he pointed out the limitations of classical computers for simulating nature, which is very difficult because nature is quantum mechanical. In his lectures at Caltech, Feynman discussed the limits placed on computation by mathematics, noise, thermodynamics, reversible computing (where it is possible to compute and then un-compute), and quantum mechanics. He predicted a new type of computer that is not a Turing machine, which he called a quantum computer.

Intel co-founder Gordon Moore noticed that computer speed doubles roughly every 18 months. This observation is known as Moore’s law. To double the speed, the number of transistors on a silicon chip needs to be doubled, so roughly every 18 months transistors shrink to half their previous size. If the trend continued, then by about 2020 every transistor would have to be the size of an atom. The obvious question is what happens after that. Many predicted that this would be the end of Moore’s law, but in reality it is the start of something new: we are now transitioning to the quantum regime, and quantum computers are predicted to offer exponential speedups over classical computers.
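The arithmetic behind the "atom-sized transistor by about 2020" prediction can be sketched in a few lines of Python. The starting point here is an assumption made for illustration (a 130 nm feature size in 2002, and an atomic diameter of roughly 0.1 nm), not a figure from the text:

```python
# Back-of-envelope check of the "atom-sized transistor by ~2020" claim.
# Assumed starting point (hypothetical, for illustration): a 130 nm
# feature size in 2002, halving every 18 months as described above.
feature_nm = 130.0
year = 2002.0
ATOM_NM = 0.1  # rough diameter of a silicon atom, in nanometres

while feature_nm > ATOM_NM:
    feature_nm /= 2   # transistor size halves...
    year += 1.5       # ...every 18 months

print(f"Feature size reaches atomic scale around {int(year)}")
# prints: Feature size reaches atomic scale around 2018
```

With these assumed numbers, eleven halvings (16.5 years) bring the feature size below atomic scale, landing in the late 2010s, consistent with the prediction above.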

A classical computer consists of billions of transistors, each of which can be switched on or off, representing the digital value 1 or 0. Instead of transistors, a quantum computer has quantum bits, or qubits. A qubit is the basic unit of quantum information. A qubit can be 1, 0, or a superposition of both at the same time, a property fundamental to quantum computing. When a qubit is measured, its superposition collapses and it settles into a definite classical state, either 1 or 0.

As an example, suppose we have a classical computer with just 2 transistors, and suppose that for some problem transistor1=1 and transistor2=0 is the solution. We may have to try all 4 possible states of the transistors, so in the worst case finding the solution requires 4 CPU cycles. A quantum computer, however, can start both qubits in a superposition of all 4 states and end up with qubit1=1 and qubit2=0 in just one step. A quantum computer with 2 qubits can represent 4 classical possibilities at the same time, 3 qubits can represent 8, and so on: each additional qubit doubles the size of the state space, and in this sense a quantum computer is inherently parallel and exponentially powerful. A 30-qubit quantum computer could outpace the world’s most powerful supercomputer, and a 300-qubit quantum computer would have a state space too large for all the computers in the world connected together to represent!
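The exponential state space described above can be illustrated with a toy state-vector simulator in plain Python. This is a sketch, not a real quantum computer: it tracks every amplitude classically, which is exactly why simulating n qubits on a classical machine takes memory proportional to 2**n (the names `hadamard_all` and `measure` are illustrative, not from any library):

```python
import random

# An n-qubit register is a vector of 2**n complex amplitudes;
# each extra qubit doubles the length of this vector.

def hadamard_all(n):
    """Put n qubits into a uniform superposition over all 2**n basis states."""
    amp = (1 / 2**n) ** 0.5          # equal amplitude for every basis state
    return [amp] * (2 ** n)

def measure(state):
    """Collapse: pick one basis state with probability |amplitude|**2."""
    probs = [abs(a) ** 2 for a in state]
    return random.choices(range(len(state)), weights=probs)[0]

state = hadamard_all(2)              # 2 qubits -> 4 amplitudes
print(state)                         # prints: [0.5, 0.5, 0.5, 0.5]
outcome = measure(state)             # one of 0..3, i.e. 00, 01, 10, or 11
print(format(outcome, "02b"))        # the measured classical bit string
```

Note that the superposition holds all four possibilities at once, but a single measurement yields only one classical answer; extracting a useful result is what quantum algorithms are designed to arrange.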

The Millennium Prize Problems are seven problems in mathematics stated by the Clay Mathematics Institute: P versus NP, the Hodge conjecture, the Poincaré conjecture, the Riemann hypothesis, Yang–Mills existence and mass gap, Navier–Stokes existence and smoothness, and the Birch and Swinnerton-Dyer conjecture. Of these, only the Poincaré conjecture has been proved. Informally, the P versus NP problem asks whether every problem whose solution can be quickly verified by a computer can also be quickly solved by a computer. This problem is of particular interest because if the answer is yes, the other six Millennium Prize Problems would in effect be solved as well, since one could simply program a computer to find their proofs. Of course, most of us already believe that the answer to the P versus NP problem is no. But does the answer still hold for a quantum computer? Can we exploit the exponentiality inherent in the quantum world to give a positive answer to this question?

[Ref: Michelle Simmons, Tony Hey, Scott Aaronson]