Quantum Computing, Between Promise and Practicality
Few technologies have sparked as much curiosity, and caution, as quantum computing. Long confined to theory and experiment, it now appears to be entering a new phase. But what exactly makes it different from conventional computing, and why does it matter?
A New Kind of Computation
Quantum computing represents a fundamentally different model of processing information. Rather than solving problems step by step, as classical computers do, quantum systems explore many outcomes simultaneously by applying principles of quantum mechanics, such as superposition and entanglement.
Imagine a marble maze. A regular computer is like rolling one marble at a time through the maze. Even with many workers (multiple processor cores) rolling marbles in parallel, each marble still follows a single path. Solving the maze means checking possible routes one by one until a marble finds the exit.
Quantum computers operate differently. Picture a wave-like marble moving down all paths in the maze at once. These wave-like computations interact: wrong turns cancel each other out, while the correct path reinforces itself. When you measure the system, the wave collapses: it doesn’t show you the whole journey, it simply gives you the answer.
This is not just a question of speed. Quantum computing offers a fundamentally new approach to solving complex problems—an entirely different kind of engine.
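The cancellation described in the maze analogy can be made concrete with a few lines of linear algebra. The sketch below (a minimal NumPy illustration, not tied to any real quantum hardware or SDK) applies the Hadamard gate twice to a single qubit: the first application opens both "paths" at once, and the second makes the amplitudes for one outcome cancel while the other reinforces.

```python
import numpy as np

# A single qubit is a 2-component complex vector; |0> is the starting state.
ket0 = np.array([1, 0], dtype=complex)

# Hadamard gate: puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ ket0        # both "paths" now carry equal amplitude
recombined = H @ superposed  # the amplitudes interfere

probs = np.abs(recombined) ** 2
print(probs.real)  # [1. 0.] -- the |1> amplitude cancels; |0> is certain
```

The point of the toy example is that the intermediate state genuinely carried both possibilities, yet interference steers the final measurement to a single definite answer rather than a random one.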
From Theory to the Lab
The promise, however, has taken over a century to mature. The mathematics began with Planck’s energy quanta and Schrödinger’s wave function in the early 20th century. Yet it wasn’t until the 1980s that scientists proposed that nature itself might compute in quantum terms—and that simulating it would require quantum hardware. Experimental qubits began appearing in the 2000s, and commercial systems only emerged in recent years.
Tech historians will recognise the rhythm. Electricity required forty years of generators and grid cables before factories rewired. The internet spent decades in academic labs before reaching homes. Artificial intelligence lingered for sixty years before generative models captured global attention. But history also warns us that not all technological frontrunners last: Zeppelins once symbolised the future of air travel—until airplanes overtook them. Quantum technologies may face similar crossroads.
Fragmented Hardware, Common Goal
The technological landscape remains diverse. Superconducting circuits, used by IBM and Google, offer fast gate speeds but require near-absolute-zero refrigeration. Trapped-ion devices from Quantinuum and IonQ offer longer coherence times but operate more slowly. Photonic processors like those from PsiQuantum aim to run at room temperature but face challenges in component integration. Neutral atoms, pursued by companies like Pasqal, offer flexible connectivity. Microsoft is pursuing topological qubits, designed to be inherently more stable, while Amazon’s approach leverages superconducting cat qubits to intrinsically suppress certain types of error.
All of these systems are still noisy and limited in scale. We live in what researchers call the NISQ (Noisy Intermediate-Scale Quantum) era, in which devices can demonstrate useful behaviours but cannot yet run error-corrected, fault-tolerant workloads. The gap between prototype and scalable architecture remains significant.
Signals of Acceleration: Investment, Patents, and Plans
Public and private commitments suggest the field is advancing, with sustained efforts to close this gap. Patent activity has surged, with more than 15,000 quantum-related patents filed globally, most within the past five years.¹ McKinsey estimates a total addressable market of 173 billion dollars by 2040, driven by advances in chemistry, materials science, finance, and logistics.²
Global government programmes have now surpassed 42 billion dollars in cumulative investment.⁵ Private capital has followed the science as well: venture and private-equity flows into quantum start-ups have exceeded 4 billion dollars over the past five years.³ ⁴ Industry leaders have become more vocal. IBM’s Arvind Krishna sees “something remarkable” happening in the next three to five years. Nvidia’s Jensen Huang, once sceptical, now describes quantum as nearing an “inflection point”. Alphabet CEO Sundar Pichai compares the current phase to artificial intelligence in the 2010s—a period marked by clear promise, but still significant technical hurdles.⁵
Detailed corporate roadmaps add to the picture. IBM’s Quantum Starling system, currently in development, is on track to deliver 200 logical qubits and 100 million error-free operations by 2029. Google, following a six-milestone plan, is working toward a system with one million physical qubits and anticipates “a real breakout” within the next five years. Microsoft’s roadmap leads to a quantum supercomputer capable of one million reliable quantum operations per second, calling 2025 “the year to become quantum-ready.” Quantinuum is targeting full fault tolerance by 2029, having stated in June 2025 that it had cleared what it described as the final major hurdle.6
Whether quantum computing is truly nearing a tipping point—or still climbing toward it—is not yet clear. Progress in the field is real and accelerating, but it is rarely linear. Hardware remains fragmented, commercial use cases are still early-stage, and fundamental challenges in error correction persist.
Yet one thing has changed decisively: quantum computing is no longer a theoretical exercise. It now attracts real capital, serious infrastructure, and global strategic interest. The maze may still be long, but the exit no longer feels imaginary.
Sources
1 EconSight (2025), Global Patent Database as of May 2025
2 McKinsey & Company (2024), Quantum Technology Monitor
3 McKinsey & Company (2024), Quantum Technology Monitor
4 Yole Intelligence (2024), Quantum Technologies 2024
5 Executive interviews & speeches from Time Magazine, GTC 2025, World Government Summit
6 Company roadmaps: IBM, Microsoft, Google, Quantinuum (2025)