Let’s face it, the world we live in is governed by computers. From the food that we buy to the vacations that we take, there is, at some point, a computer that made it all possible. Computers of the past, half a century ago or so, were big, bulky machines that could fill an entire room, and they could only do very basic computing. As the technology improved, they grew more powerful but still remained huge.
Only top institutions like universities and hospitals could afford to buy one. That is, until a little-known technological leap made it possible for regular people, like you and me, to afford computers for personal use.
The invention of the transistor had a huge impact on the computing industry. A transistor is, in its most basic form, a device that regulates the electrical current passing through it – basically a gate or a switch. It has two states: “on” or “off.” These two states can be translated into binary language (0 and 1), a language that computers know how to speak.
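To make that switch-to-binary idea a little more concrete, here is a minimal sketch in Python; the switch values are made up purely for illustration:

```python
# A minimal sketch: reading a row of eight on/off switches as one binary number.
# The particular switch settings below are invented for this example.
switches = [0, 1, 0, 0, 0, 0, 0, 1]   # 1 = "on", 0 = "off"
value = int("".join(str(s) for s in switches), 2)
print(value)       # 65 -- the number these eight switches encode
print(chr(value))  # "A" -- the character computers store with that number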
Transistors replaced the space-consuming and very fragile vacuum tubes. Computers went from being the size of a room to something that sits on a desk. Suddenly, anyone could have a computer of their own.
The design of the transistor went through many evolutions, but one thing was certain: it kept getting smaller and smaller. The transistors in our laptops, smartphones, and voice recorders are measured in nanometers, with the smallest measuring between 5 and 7 nanometers. Scientists believe we are reaching the limit of how small we can make transistors. Any smaller and a phenomenon known as quantum tunneling takes place: electrons are able to slip through a transistor even in its “off” state because of how small it is.
Of course, there are still other ways to make computers perform faster. We can make programs run more efficiently, and we can design smarter, more efficient chip architectures. However, many scientists believe that the real way past this computing limit is to develop quantum computers.
What is quantum computing?
The field that studies atomic and subatomic particles is called “quantum physics.” Scientists in this field deal with particles so small that they have to rely on mathematics to describe their properties. The field is still very young and has yet to understand, or even experiment on, a great many things. Despite this, numerous studies have broken new ground in quantum physics. And with those breakthroughs came applications.
The quantum version of a bit, known as a qubit, is theorized to exist in more than just two states. In traditional terms, a qubit can be “on,” “off,” or a blend of “on” and “off” at the same time. That last state is known as superposition. The fact that qubits can hold more than two states means a quantum computer has, in principle, greater computing power than any transistor-based computer can ever achieve.
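To give a feel for superposition, here is a minimal sketch in plain Python (not a real quantum SDK; the amplitude values are chosen only for illustration). A qubit is stored as two amplitudes, and it only becomes an ordinary 0 or 1 when it is measured:

```python
import random

# A minimal sketch: one qubit represented by two amplitudes.
# In superposition it is "both" 0 and 1; measuring it collapses it to a
# single classical bit, with probabilities set by the amplitudes.
alpha, beta = 2 ** -0.5, 2 ** -0.5   # an equal mix of "off" and "on"
assert abs(abs(alpha) ** 2 + abs(beta) ** 2 - 1) < 1e-9  # amplitudes must be normalized

def measure() -> int:
    # Probability of reading 0 is |alpha|^2; otherwise we read 1.
    return 0 if random.random() < abs(alpha) ** 2 else 1

print([measure() for _ in range(10)])  # roughly half 0s and half 1s
```

The point of the sketch is only that, until it is measured, the qubit carries both possibilities at once, which is where the extra computing power is thought to come from.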
Quantum computers are expected to benefit virtually every industry, including finance, healthcare, the military, artificial intelligence, education, and many more. Quantum computing could help leapfrog human technology the same way that mastering fire helped Homo sapiens become the dominant species on Earth.
Will it replace traditional computing?
So if quantum computing is so much better than traditional computing, where is it? How come we haven’t seen the likes of Apple, Qualcomm, and Intel release quantum chips?
The major hurdle with quantum computing is what is known as quantum decoherence. The outside world affects the state of a qubit, so qubits need to be isolated to stay reliable and accurate. On top of that, decoherence is irreversible: once a qubit decoheres, it cannot return to a state of superposition, which essentially makes it useless for computing.
Current quantum computers need to cool their qubits to extremely low temperatures to prevent decoherence. The energy required just to reach such extreme temperatures makes quantum computers impractical for public use.