A pioneering article in the mainstream media, published by The Guardian, presented quantum computing as the technology that many scientists, entrepreneurs and big businesses expected to provide a, well, quantum leap into the future. What was true two years ago is even truer now.
The concept of quantum computing is relatively new, dating back to ideas put forward in the early 1980s by the late Richard Feynman, the brilliant American theoretical physicist and Nobel laureate. He conceptualized the improvements in speed that might be achieved with a quantum computer. But theoretical physics, while a necessary first step, leaves the real brainwork to practical application.
With normal computers, or classical computers as they’re now called, there are only two options – on and off – for processing information. A computer “bit”, the smallest unit into which all information is broken down, is either a “1” or a “0”. In the mysterious subatomic realm of quantum physics, particles can act like waves, so that a particle can behave as a particle, as a wave, or as both at once. This is what’s known in quantum mechanics as superposition. As a result of superposition, a qubit can be 0, or 1, or both 0 and 1 at the same time. That means it can, in effect, hold two values at once. Two qubits can hold four values at once, three qubits eight, and so on in an exponential expansion. That leads to some inconceivably large numbers, not to mention some mind-boggling working concepts.
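For readers who like to see the idea made concrete, here is a minimal sketch in Python with NumPy. The function name equal_superposition and the choice of library are my own, purely for illustration; the sketch simply tracks the state of n qubits as a vector of 2^n amplitudes, which is why every extra qubit doubles the number of values the state can hold at once.

```python
import numpy as np

# A single qubit starts in the state |0>: amplitude 1 for "0", amplitude 0 for "1".
ket0 = np.array([1.0, 0.0])

# The Hadamard gate puts one qubit into an equal superposition of 0 and 1.
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

def equal_superposition(n_qubits):
    """Return the state vector of n qubits, each passed through a Hadamard gate."""
    state = H @ ket0                      # one qubit in superposition: 2 amplitudes
    for _ in range(n_qubits - 1):
        state = np.kron(state, H @ ket0)  # each extra qubit doubles the vector
    return state

for n in (1, 2, 3, 10):
    state = equal_superposition(n)
    print(f"{n} qubit(s) -> {state.size} simultaneous amplitudes")
    # The squared amplitudes are probabilities, so they must sum to 1.
    assert np.isclose(np.sum(np.abs(state) ** 2), 1.0)
```

Running it prints 2, 4, 8 and 1,024 amplitudes for 1, 2, 3 and 10 qubits respectively. That doubling with every added qubit is exactly the exponential expansion described above, and it is also why simulating more than a few dozen qubits on a classical machine quickly becomes impractical.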
In areas such as artificial intelligence and cryptography, quantum computing will transform the landscape, perhaps bringing about the breakthrough that will enable machines to “think” with the nuance and interpretative skill of humans. I will review these applications at my upcoming seminars on Quantum Computing put together by GLDNAcademy.