Quantum computing is a topic that's both frightening and extremely interesting for computer science majors. It's an exciting field because it's a completely new paradigm, one that could help us solve problems that are too difficult for modern computers. However, it's also a little scary for programmers, because it has the potential to put all of us out of work and render our skills useless.
Quantum computing is a very complicated topic, but at its core it leverages quantum mechanical properties to change a fundamental convention of modern computers: the bit. In a traditional computer, we define the state of a program through digital information encoded in binary, a series of 1s and 0s that describes, at a very low level, everything from what our screen looks like to the words that appear on it.
The quantum computer's counterpart to the bit is the qubit. A qubit challenges the idea that a bit must be either 1 or 0: using a principle of quantum mechanics known as superposition, it can exist in a combination of both states at once. This curious property is what lets quantum algorithms effectively explore billions of possibilities in parallel, which is where their speedups on certain problems come from. A classical bit gets its value from a voltage, while a qubit's value can be encoded in a physical property like an electron's spin (in some silicon-based designs, the spin of a phosphorus atom's electron). This video fleshes out the concept pretty well:
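To make superposition a little more concrete, here's a toy sketch (my own illustration, not real quantum hardware or any particular library): a single qubit's state can be described by two amplitudes, and measuring it collapses the superposition to a definite 0 or 1 with probabilities given by those amplitudes.

```python
import random

# Toy single-qubit simulator (illustrative only). A qubit's state is a pair
# of amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
# |alpha|^2 is the probability of measuring 0, |beta|^2 of measuring 1.

def measure(alpha, beta):
    """Collapse the superposition: return 0 or 1 with the right probabilities."""
    p_zero = abs(alpha) ** 2
    return 0 if random.random() < p_zero else 1

# An equal superposition: "both 0 and 1 at once" -- until we look.
alpha = beta = 2 ** -0.5  # 1/sqrt(2), so each outcome has probability 1/2

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1

print(counts)  # roughly 5000 of each
```

The key point the sketch captures: before measurement the state holds both possibilities, but any single measurement yields only one of them, which is part of why programming these machines is so different.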
So while I joked earlier that quantum computers may put computer scientists out of work, that definitely won't happen in the near future. The reason quantum computing is so attractive right now, and the reason companies like Google and IBM are willing to spend millions researching it, is that quantum computers are faster in certain applications, such as factoring large numbers (which would make our current RSA encryption systems fairly obsolete) and creating and operating on enormous data sets (probably a huge goal of projects like Google's work with D-Wave).
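Why does factoring matter for RSA? The security of RSA rests on the fact that, given a modulus n = p × q, recovering p and q classically takes astronomically long for the huge numbers used in practice, while Shor's algorithm on a quantum computer could do it dramatically faster. A tiny toy example (deliberately small numbers; real RSA moduli are hundreds of digits):

```python
# Toy illustration of why factoring guards RSA (not real cryptography).
# Naive classical factoring by trial division takes O(sqrt(n)) steps,
# which is hopeless for a 2048-bit modulus -- but Shor's algorithm on a
# quantum computer would factor it efficiently.

def trial_division(n):
    """Return a factor pair (p, q) of n, found by brute force."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n is prime

# A tiny "RSA modulus" built from two primes.
p, q = 1009, 2003
n = p * q
print(trial_division(n))  # (1009, 2003)
```

For this toy n the loop finishes instantly; double the number of digits in n and the running time roughly squares, which is the wall classical attackers hit and quantum factoring would tear down.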
Here’s one more video that does a really awesome job explaining some of the more intricate details of quantum computing (with cartoons!). This YouTube channel has a lot of really cool animated videos that I’ll probably use in later posts too.
I had been planning to write about quantum computing for my first real post for a few days now, and it seems my timing worked out really well, because just today IBM scientists announced some notable advances in quantum computing. The nature of quantum computing makes debugging very difficult: we can't observe the intermediate steps of a quantum process, because for a very interesting reason our mere observation of the system would kill the process (something I might explain in more depth in a later post). However, scientists at IBM have come up with a way to detect both types of quantum errors (bit-flips and phase-flips) at the same time, an important step toward having the computer correct its own errors.
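The classical cousin of this idea is easy to sketch. This is only an analogy (quantum error correction is much harder, since you must catch both bit-flip and phase-flip errors without directly reading the qubits), but the spirit of "add redundancy, then vote" is the same. A minimal classical 3-bit repetition code:

```python
# Classical analogy for error correction (not quantum!): a 3-bit
# repetition code survives any single bit-flip via majority vote.

def encode(bit):
    """Store three redundant copies of one bit."""
    return [bit, bit, bit]

def correct(codeword):
    """Majority vote recovers the original bit despite one flipped copy."""
    return 1 if sum(codeword) >= 2 else 0

word = encode(1)
word[0] ^= 1          # noise flips one copy: [0, 1, 1]
print(correct(word))  # 1 -- the error is corrected
```

The quantum versions of such codes are exactly what results like IBM's are building toward: machines that notice and fix their own errors mid-computation, without anyone peeking at the fragile quantum state.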
So this is a really novel field of research, and while it's a rapidly advancing technology that has even won a Nobel Prize in Physics, it's unlikely that quantum computers will become as ubiquitous as PCs are today within our lifetime. However, that's all the more reason to work harder at breaking expectations and making revolutionary innovations!