Quantum Computing - The Technology & Progress

November 12th 2025

For decades, quantum computing has been science’s most beguiling promise - a technology that could, in theory, crack the hardest problems in chemistry, physics, and mathematics by harnessing the strange rules of the quantum world. It has also been one of its most frustrating pursuits, trapped for years between elegant theory and stubborn engineering limits. Yet today, that balance is shifting. For the first time, quantum computing is beginning its long transition from lab trick to infrastructure - from fragile experiment to foundational technology.

This deep dive, split into two parts, explores both the science and the business of that transformation.

The first part, The Technology & Progress, traces the full arc of quantum computing’s evolution: how it works at the most fundamental level, why building a practical quantum computer has proven so hard, and the breakthroughs - in qubit stability, error correction, and scalable architectures - that are finally moving the field from theoretical to tangible. It also looks ahead at what still stands in the way of mass adoption: the remaining engineering barriers, the need for better algorithms and software, and the timeline for when quantum computing becomes useful beyond the lab.

The second part, The Business & Opportunity, shifts focus to the commercial frontier - examining how the quantum ecosystem is forming, where value will accrue across the hardware–software stack, and which sectors are likely to benefit first. It explores the companies building the hardware, the control and cloud layers turning physics into usable compute, and the applications - in chemistry, finance, and logistics - where quantum advantage could deliver real economic impact.

Together, these two parts show how a once-speculative field is becoming a platform for the next generation of computation. Readers will come away with a clear picture of how quantum computing actually works, the scientific hurdles being overcome, and the emerging business landscape behind one of the most important, and least understood, technological shifts of the 21st century.

If this has been forwarded to you, you can subscribe for free here to get more deep dives on the innovations shaping our world.

How It Works

Quantum computing is one of those subjects that seems designed to make you feel dumb. People throw around words like superposition, entanglement, and interference - and before long, it starts sounding like a cross between science fiction and spirituality.

But under the hood, quantum computing isn’t mystical at all. It’s just physics pushed to its limits.

Qubits: The “and” instead of “or”

At the heart of it all is the qubit, the quantum version of a bit. A classical bit, the digital lifeblood of your phone or laptop, can be either 0 or 1. A qubit can be 0 and 1 at the same time, a property called superposition.

The easiest way to picture it is as a spinning coin. While it’s in the air, the coin isn’t heads or tails - it’s both, simultaneously. Only when you catch it (or in quantum terms, measure it) does it “collapse” into one definite state. The difference is that in a quantum computer, the “spinning” happens in a mathematically precise way. A qubit can lean 70% toward 0 and 30% toward 1, or any other ratio.
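To make that concrete, here's a minimal sketch in plain Python/NumPy (a toy model, not any real quantum SDK) of a qubit that leans 70% toward 0, and what happens when you "catch the coin":

```python
import numpy as np

rng = np.random.default_rng(7)

# A qubit's state is a pair of complex "amplitudes" (a, b) with |a|^2 + |b|^2 = 1.
# The squared magnitudes are the measurement probabilities.
qubit = np.array([np.sqrt(0.7), np.sqrt(0.3)])  # leans 70% toward 0, 30% toward 1

# "Catching the coin": measurement collapses the state to 0 or 1 at random,
# weighted by the squared amplitudes.
probs = np.abs(qubit) ** 2
samples = rng.choice([0, 1], size=10_000, p=probs)
print(f"measured 0: {np.mean(samples == 0):.1%}, measured 1: {np.mean(samples == 1):.1%}")
# -> roughly 70.0% / 30.0%
```

Any single measurement gives you just one bit; the 70/30 lean only shows up across many repeated runs. That's why quantum algorithms are typically run thousands of times.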

The magic happens when you connect qubits together. Each qubit you add multiplies the number of possible configurations the system can hold, so the total grows exponentially. But unlike a classical computer, which holds only one configuration at a time, a quantum computer keeps all of them in play as probability amplitudes - shaping and interfering with them until the right answer becomes the most likely one when measured.

Two qubits can represent 4 possible combinations at once. Ten qubits can represent 1,024. Fifty qubits can represent over a quadrillion - more than any classical supercomputer could track explicitly - and by around 300 qubits, the number of configurations exceeds the count of atoms in the observable universe.

That’s why quantum computing is so powerful in theory: even a modest number of qubits can explore a landscape of possibilities so vast that no classical machine could ever brute-force its way through it.
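A quick back-of-envelope calculation in ordinary Python shows why: simulating n qubits on a classical machine means storing 2^n complex amplitudes.

```python
# Simulating n qubits classically means storing 2**n complex amplitudes,
# each roughly 16 bytes (a double-precision complex number).
for n in (2, 10, 50):
    print(f"{n:2d} qubits -> {2**n:,} amplitudes, ~{2**n * 16:,} bytes")
# 50 qubits -> ~1.1 quadrillion amplitudes, ~18 petabytes of memory
```

At 50 qubits you'd need roughly 18 petabytes of RAM just to write the state down, let alone update it billions of times.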

Entanglement: spooky teamwork

Superposition is what makes qubits powerful. Entanglement is what makes them weird.

When two qubits are entangled, their states become linked - even if they’re far apart. Measure one, and the other’s outcome is instantly correlated with it, no matter the distance. Einstein called it “spooky action at a distance”.

Entanglement lets qubits share information non-locally, allowing quantum algorithms to coordinate outcomes across many qubits at once. It’s the secret sauce behind phenomena like quantum teleportation and error correction - and it’s also what makes the math insanely hard.
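Here's the simplest entangled pair, a Bell state, again in toy NumPy form (not a real SDK). Each qubit alone looks like a fair coin flip, yet the pair always agrees:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two-qubit states live in a 4-dimensional space: |00>, |01>, |10>, |11>.
# The Bell state (|00> + |11>) / sqrt(2) is the simplest entangled pair.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

probs = np.abs(bell) ** 2  # [0.5, 0, 0, 0.5]
outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(outcomes)
# You only ever see "00" or "11", never "01" or "10": each qubit alone
# is a 50/50 coin, but the two always land the same way.
```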

Interference: nature’s way of voting

The final piece is interference. Once qubits are in a superposition of many possible states, quantum algorithms work by nudging some outcomes to reinforce each other and others to cancel out.

You can think of it like waves in a pond: when ripples overlap, some peaks add up, others cancel out to flat water. Quantum interference is the same, but with probabilities instead of water. The art of designing a quantum algorithm lies in choreographing that interference so that the correct answers remain standing when you measure the system - and the wrong ones vanish.
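You can see that cancellation in four lines of linear algebra. Applying the Hadamard gate (the standard "make a superposition" operation) twice returns a qubit to exactly where it started, because the two paths leading to "1" carry opposite signs and wipe each other out:

```python
import numpy as np

# The Hadamard gate H puts a qubit into an equal superposition.
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

zero = np.array([1, 0])   # start in |0>
once = H @ zero           # 50/50 superposition: [0.707, 0.707]
twice = H @ once          # apply H again

print(once, twice)
# twice == [1, 0]: the two paths into "1" carry amplitudes +1/2 and -1/2
# and cancel, while the paths into "0" reinforce. That is interference.
```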

Hardware: the zoo of quantum machines

So far, so good. But here’s where things get wild.

Because qubits are just physical systems that can hold two quantum states, there are many ways to build them - and the industry still hasn’t settled on a winner.

  • Superconducting qubits, used by Google and IBM, rely on tiny circuits cooled to near absolute zero, where electrons pair up and flow without resistance. They’re fast and familiar (engineers love them because they behave like electrical circuits), but they require massive cryogenic systems to stay stable.

  • Trapped ions, favored by IonQ and Quantinuum, use charged atoms suspended in electromagnetic fields. They’re beautifully precise and naturally identical - but slower and harder to scale.

  • Photonic qubits, the bet of PsiQuantum and Xanadu, use individual photons of light. They don’t need freezing temperatures and might scale well using optical chips, but their control systems are still experimental.

  • Neutral atoms and topological qubits are newer frontiers, each with unique advantages and engineering nightmares of their own.

Right now, no one knows which approach will win. It’s like the 1950s of classical computing, when vacuum tubes, relays, and transistors all competed for dominance - or like early electric cars before Tesla standardized the playbook.

A New Kind of Computing

If classical computers are like methodically checking every road on a map, quantum computers are more like dropping millions of marbles onto a landscape of hills and valleys all at once, then watching which paths the marbles take most often as they settle into the lowest points. Over many repetitions, the best routes reinforce each other - and the system naturally converges on the optimal answer.

That’s what makes quantum computing so powerful and so challenging: you’re not just programming logic; you’re orchestrating probabilities.

Why It’s Been So Hard

Quantum computing is one of those ideas that looks simple on the whiteboard and almost impossible in the real world.

The concept, harnessing the bizarre rules of quantum mechanics to process information, has been understood since the 1980s. The challenge has always been turning the physics into an actual machine that works longer than a few microseconds.

At the heart of the problem is a single word: fragility.

The Enemy: Decoherence

Qubits are delicate. They must remain in superposition, that magical blend of 0 and 1, long enough to perform calculations. But the tiniest interaction with the environment - a stray photon, a vibration, a flicker of heat - can cause a qubit to “decohere,” collapsing it back to a definite state.

It’s like trying to keep a soap bubble intact while performing surgery inside it.

Even today, the best superconducting qubits can maintain coherence for only a few hundred microseconds, while trapped-ion qubits can remain stable for several seconds under ideal conditions.

Errors Everywhere

On top of fragility, there’s noise - random fluctuations that subtly distort calculations. Every time a quantum gate (an operation that changes a qubit’s state) is performed, there’s a small chance of error.

In classical computing, a bit flip from 0 to 1 can be corrected easily by redundancy - you can copy, verify, and majority-vote your way to accuracy. But quantum information can’t be cloned directly (a rule called the no-cloning theorem). That makes traditional error correction impossible.
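To see exactly what quantum machines are missing, here's the classical trick in miniature - a toy sketch of a 3-copy repetition code with majority voting:

```python
import random

def send_with_redundancy(bit, flip_prob=0.1, copies=3):
    """Classical error correction: copy the bit, let noise flip each copy
    independently, then take a majority vote."""
    noisy = [bit ^ (random.random() < flip_prob) for _ in range(copies)]
    return int(sum(noisy) > copies / 2)

trials = 100_000
raw_errors = sum(random.random() < 0.1 for _ in range(trials))
voted_errors = sum(send_with_redundancy(0) != 0 for _ in range(trials))
print(f"raw error rate:   {raw_errors / trials:.3f}")    # ~0.100
print(f"voted error rate: {voted_errors / trials:.3f}")  # ~0.028
```

Three copies cut a 10% error rate to under 3%. The very first line of that strategy - "copy the bit" - is precisely what the no-cloning theorem forbids for qubits, which is why quantum error correction had to be invented from scratch.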

The result? Even small amounts of noise quickly cascade into incorrect results, rendering long computations useless.

Error Correction: The Mount Everest of Quantum

To overcome this, researchers developed the idea of quantum error correction - encoding one “logical” qubit using dozens, hundreds, or even thousands of “physical” qubits so that errors can be detected and corrected without destroying the underlying quantum information.

The theory works beautifully on paper. In practice, it’s brutally demanding. To run a single fault-tolerant logical qubit today might require thousands of physical qubits. And that’s just for one.

Most of today’s leading systems have 100 to 1,000 physical qubits - enough for experiments, not for the error-corrected architectures needed to run real algorithms. That’s why so much of the field’s energy has gone into improving fidelity - the precision with which each quantum operation is executed. Every extra decimal point of reliability reduces the overhead needed for error correction.
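A rough back-of-envelope makes the fidelity point vivid. Using the textbook surface-code scaling (logical error per cycle roughly 0.1·(p/p_th)^((d+1)/2), with about 2d² physical qubits per logical qubit) - where the threshold, target rate, and constants below are illustrative assumptions, not any vendor's roadmap:

```python
# Rough surface-code overhead estimate (textbook scaling, illustrative only):
# logical error per cycle ~ 0.1 * (p / p_th) ** ((d + 1) / 2),
# with ~2 * d**2 physical qubits per logical qubit.
P_TH = 0.01      # assumed error-correction threshold
TARGET = 1e-12   # assumed target logical error rate

def physical_qubits_needed(p):
    d = 3
    while 0.1 * (p / P_TH) ** ((d + 1) / 2) > TARGET:
        d += 2   # code distance must be odd
    return d, 2 * d * d

for p in (1e-3, 1e-4, 1e-5):
    d, n = physical_qubits_needed(p)
    print(f"gate error {p:.0e}: distance {d}, ~{n:,} physical qubits per logical")
# 1e-03 -> ~882 physical qubits; 1e-04 -> ~242; 1e-05 -> ~98
```

Under these assumptions, each factor-of-ten improvement in gate error shrinks the per-logical-qubit overhead severalfold - which is why fidelity, not raw qubit count, has become the headline metric.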

Engineering Meets Physics

Then there’s the engineering. Superconducting qubits require dilution refrigerators colder than deep space. Trapped-ion systems need ultra-high vacuum chambers and laser arrays with nanometer precision. Photonic qubits rely on perfectly aligned optical circuits.

Scaling any of those setups from tens of qubits to thousands - with each one isolated, controlled, and synchronized - is one of the hardest engineering challenges humans have ever attempted.

In the next section, we’ll look at how, after decades of incremental gains, the breakthroughs of the last couple of years mark a real inflection point - not just bigger qubit counts, but better, more stable, more connected systems that might finally break through quantum’s long-standing barrier of fragility.

The Inflection: 2024–2025 Breakthroughs

For years, the quantum computing story felt like déjà vu: every new milestone arrived wrapped in the same headline - “A Step Closer to Practical Quantum.”

But in 2024 and 2025, that cautious optimism started to sound different. The headlines didn’t just announce bigger qubit counts or faster gates; they described qualitative improvements - the kind that move a technology from laboratory curiosity to commercial inevitability.

The Quality Revolution: Beyond Counting Qubits

In the 2010s, quantum progress was measured by how many qubits you could squeeze onto a chip. IBM, Google, and Rigetti traded bragging rights as they scaled from 5 to 50 to 100 qubits. It was impressive - but misleading. Adding more noisy qubits to a fragile system didn’t bring the world closer to useful computation; it just made the noise louder.

The breakthrough came when the focus shifted from quantity to quality.

IBM’s 2023 Heron processor marked the beginning of that pivot. Unlike its predecessor Eagle, Heron wasn’t about chasing record qubit counts; it was about fidelity. With a modest 133 qubits, it achieved unprecedented gate reliability - enough to run deeper circuits and sustain coherence for longer stretches. Then came Condor, IBM’s first 1,121-qubit chip - a milestone not just of scale but of architectural maturity, designed to serve as a testbed for modular quantum systems.

In parallel, Google unveiled Willow, a new platform demonstrating “quantum echoes” - the ability to verify computational advantage on real tasks, not just contrived benchmarks. After years of skepticism over the term quantum supremacy, Willow marked something subtler and more meaningful: the start of useful quantum performance.

The Road to Error Correction

In 2024, Quantinuum demonstrated a logical qubit that could survive for longer than its physical components - the first experimental proof that error correction could extend coherence in practice, not just in theory. Around the same time, Paris-based startup Alice & Bob pushed forward with “cat-code” qubits - a design that encodes quantum information in superpositions of photon states, dramatically suppressing bit-flip errors.

Meanwhile, researchers have refined quantum low-density parity-check (QLDPC) codes, which spread quantum information across many qubits in a sparse, overlapping pattern. If a few qubits fail, the code can still reconstruct the original state - dramatically reducing the amount of redundancy required. These QLDPCs point to fault-tolerant systems that could need only tens of physical qubits per logical one, rather than thousands. It’s the kind of breakthrough that could make large-scale quantum computing feasible within a decade.

New Hardware

Beyond superconductors and trapped ions, entirely new qubit architectures are starting to show teeth.

Neutral atom systems, led by companies like QuEra and Pasqal, are scaling rapidly, thanks to their ability to trap and manipulate hundreds of atoms in laser-generated grids. These systems have shown remarkable coherence times and reconfigurability - two of the hardest problems in quantum design.

Photonic qubits are also gaining momentum. PsiQuantum and Xanadu both advanced integrated photonics platforms that use light, not matter, to represent qubits - offering potential scalability and resilience at room temperature. The field is no longer a single horse race; it’s a multiverse of competing physical paradigms, each inching toward a viable commercial platform.

Software, Cloud, and Ecosystem Maturity

Hardware isn’t the only frontier that’s maturing. The quantum software stack, long the neglected sibling of physics, is rapidly professionalizing. Software development kits (SDKs) like Qiskit, Cirq, and Amazon’s Braket are integrating with Python-based data-science workflows. Hybrid algorithms that pair classical and quantum computation are making it possible to extract partial value even from today’s noisy systems.
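For a flavor of what those SDKs look like in practice, here's a minimal sketch (assuming a recent Qiskit release, 1.0 or later) that builds a small entangled circuit and samples it on a local simulator - ordinary Python all the way down:

```python
from qiskit import QuantumCircuit
from qiskit.primitives import StatevectorSampler

# A three-qubit "GHZ" circuit: one Hadamard plus two CNOTs entangles all three.
qc = QuantumCircuit(3)
qc.h(0)
qc.cx(0, 1)
qc.cx(1, 2)
qc.measure_all()

# Sample the circuit 1,000 times on a local statevector simulator.
result = StatevectorSampler().run([qc], shots=1000).result()
print(result[0].data.meas.get_counts())
# -> roughly {'000': ~500, '111': ~500}: all three qubits always agree
```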

At the same time, quantum has gone cloud-native. AWS Braket, Microsoft Azure Quantum, and Google Cloud are now hosting live quantum devices accessible via API - turning what was once lab hardware into an on-demand service layer. That shift is pulling real enterprises into the fold: pharmaceutical companies testing molecular simulations, logistics firms exploring quantum optimization, and banks running portfolio risk analyses on early quantum accelerators.
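The cloud-native pattern looks much the same. As a hedged sketch with the Amazon Braket SDK, using its bundled local simulator (pointing the same code at a hosted device's ARN is what turns it into an API call to real hardware):

```python
from braket.circuits import Circuit
from braket.devices import LocalSimulator

# The same Bell-pair idea, written in Braket's circuit API.
circuit = Circuit().h(0).cnot(0, 1)

# LocalSimulator runs on your laptop; swapping in AwsDevice(<device ARN>)
# would send the identical circuit to cloud-hosted quantum hardware.
device = LocalSimulator()
result = device.run(circuit, shots=1000).result()
print(result.measurement_counts)
# -> Counter({'00': ~500, '11': ~500})
```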

The Policy Tailwinds

Governments, too, are betting on the transition from “theory” to “tool.” The U.S. CHIPS and Science Act earmarked billions for quantum R&D. Europe launched its Quantum Flagship 2.0, while China and Japan continue to scale national programs. In parallel, the U.S. National Institute of Standards and Technology (NIST) began formalizing post-quantum cryptography (PQC) standards - the defensive counterpart to quantum’s offensive computing power.

Together, these developments signal that quantum is no longer a pure science project; it’s a strategic capability - and one that major economies now see as foundational to long-term competitiveness.

Next, we’ll look at what’s left - the final hurdles before quantum computing becomes as boringly dependable as the cloud.

What Still Needs to Happen (and How We’ll Get There)

By now, the challenge is clear. As we saw earlier, the problem was never understanding quantum physics - it was building machines stable enough to harness it. Recent breakthroughs have shown that while individual qubits are still fragile, improved architectures and error correction now let quantum systems stay coherent long enough to do meaningful work. The next step is scaling them into something useful, repeatable, and affordable.

Fault Tolerance

The industry’s north star is fault tolerance - a computer that can automatically detect and correct its own quantum errors faster than they appear. That’s the milestone that turns an experimental setup into an industrial tool.

Most roadmaps - IBM’s, Quantinuum’s, Google’s - point to a first fault-tolerant logical qubit by 2028. IBM’s Heron → Condor → Flamingo → Kookaburra roadmap explicitly targets this: higher-fidelity qubits connected through modular cryogenic systems. Startups like Alice & Bob (cat-code superconducting qubits) and BlueQubit (software-side error mitigation) are attacking the same goal from different angles.

Three engineering fronts are pushing us there:

Hardware refinement. IBM, Google, and IonQ are all in a quiet race toward “five-nines” gate fidelity - 99.999% reliability per operation. Google’s Willow qubits already exceed 99.9%, while IonQ’s Forte II trapped-ion system has reached 99.99% two-qubit fidelity with coherence times in seconds. Each extra decimal point trims the number of physical qubits needed for error correction, multiplying overall efficiency.

Cryogenic and control electronics. Intel’s Horse Ridge II chip integrates control electronics inside the fridge, cutting latency by roughly 100×. Zurich Instruments and Oxford Quantum Circuits are doing the same for modular systems, embedding signal generation and feedback directly into cryogenic boards. This isn’t glamorous physics - it’s precision plumbing - but it’s what turns delicate experiments into reproducible products.

Modular architectures. IBM’s Quantum System Two, unveiled in late 2023, is a literal data-center-grade refrigerator that can host multiple quantum chips connected by microwave and optical interlinks. Neutral-atom players like Pasqal and QuEra are doing the same at room temperature, using arrays of hundreds of laser-trapped atoms that can be rearranged and networked. It’s the beginning of a distributed-quantum model: clusters of small, high-quality chips acting as one machine - exactly how classical cloud computing scaled.

Interconnects and Integration

Of course, linking chips only works if they can talk to each other - and that’s one of the hardest engineering problems left to solve. An interconnect has to move quantum states between chips without destroying the delicate superpositions that make them useful.

PsiQuantum and Xanadu are betting on photonics, developing single-photon sources that can carry quantum information through optical fibers. In parallel, Rigetti and QuantWare are experimenting with microwave-to-optical converters that could link superconducting chips across fridges. European consortia like QuNet (Germany) and UK Quantum Network are building early “quantum intranets,” linking university nodes with secure entanglement channels.

The goal isn’t flashy - it’s to make qubits communicative, turning isolated experiments into distributed systems. Once interconnects mature, modular scaling becomes a matter of adding racks, not reinventing physics.

The Software Maturity Gap

Hardware progress is meaningless without software that ordinary developers can use. Today’s quantum code looks more like lab scripts than production software - but that’s changing fast.

Microsoft’s Azure Quantum Elements and Amazon Braket Hybrid Jobs now let users combine classical and quantum routines in a single workflow, hiding the quantum layer behind Python APIs. Classiq, Riverlane, and Q-CTRL are building compilers and error-suppression layers that automatically optimize circuits for the quirks of each hardware platform. On the open-source side, IBM’s Qiskit 2.0 and Google’s Cirq are converging toward standard formats, allowing the same quantum algorithm to run on different systems.

This software stack maturity is the unseen catalyst. Once tools, simulators, and SDKs stabilize, enterprises can focus on business logic instead of calibration curves - the same transition that turned cloud computing from a science project into a utility.
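The hybrid pattern those platforms support is simple at its core: a classical optimizer proposes circuit parameters, the quantum processor estimates a cost, and the loop repeats. A toy version in plain Python/SciPy - with the quantum step replaced by its exact math, since a one-qubit circuit is easy to compute by hand - looks like this:

```python
import numpy as np
from scipy.optimize import minimize

def energy(params):
    """Cost function a real workflow would estimate on quantum hardware.
    Here: the state RY(theta)|0> = [cos(theta/2), sin(theta/2)], and the
    expectation value of the Pauli-Z observable, which equals cos(theta)."""
    theta = params[0]
    amp0, amp1 = np.cos(theta / 2), np.sin(theta / 2)
    return amp0**2 - amp1**2

# The classical half: an off-the-shelf optimizer steering the "quantum" step.
result = minimize(energy, x0=[0.1], method="COBYLA")
print(result.x, result.fun)  # converges toward theta = pi, energy = -1
```

In production, only the inside of `energy` changes - it becomes a batch of shots on a cloud-hosted device - while the classical loop stays exactly this ordinary.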

Early Adoption

For all the progress and publicity, quantum computing today remains in its pilot phase - a technology straddling the line between research and business. The machines are real, the software stack is maturing, but the workloads are still narrow, specialized, and expensive to run.

Most commercial engagement happens through cloud access programs, where enterprises buy time on quantum processors hosted by IBM, IonQ, Quantinuum, or AWS Braket. These programs serve a dual purpose: they fund ongoing hardware R&D while giving businesses hands-on experience with the algorithms of the future.

The most serious early adopters fall into four sectors - chemistry, materials, finance, and optimization - each experimenting in its own way.

Chemistry and Materials

This is where quantum’s near-term promise is most tangible. Pharmaceutical and materials companies are using small-scale quantum simulators to model the molecular structures that determine how drugs bind or how batteries store energy - problems that scale exponentially on classical computers.

Roche, AstraZeneca, and BASF have run pilot studies on IBM’s and Quantinuum’s platforms, testing quantum models of protein folding and lithium-ion chemistry. TotalEnergies and ExxonMobil are exploring quantum-enhanced catalysts for carbon capture. The results aren’t yet faster or cheaper than classical methods, but they’re reaching comparable accuracy for small molecules - a milestone that proves quantum simulation is beginning to add value, not just intrigue.

Finance

If chemistry is about precision, finance is about speed and optimization - and that’s where quantum could eventually shine.

Major banks and hedge funds have been among the earliest to experiment, seeing quantum not as a science project but as a potential alpha engine. JPMorgan Chase, Goldman Sachs, and BBVA all maintain internal quantum research teams. JPMorgan’s group, working with IonQ and Quantinuum, has tested quantum algorithms for portfolio optimization and risk management - showing that hybrid quantum-classical routines can match classical solvers on smaller datasets.

Goldman Sachs and QC Ware have co-developed quantum Monte Carlo methods to accelerate option pricing, while Fidelity and HSBC have funded research into quantum-resistant cryptography to future-proof trading infrastructure.

The short-term reality: classical high-performance computing still dominates. But the medium-term opportunity - running simulations, pricing, and optimization on quantum accelerators integrated into cloud finance platforms - remains one of the most commercially compelling use cases once error rates fall and hardware stabilizes.

Optimization and Logistics

Quantum annealers (optimizers) from D-Wave and hybrid solvers from Zapata AI and QC Ware are already being tested for route optimization, scheduling, and supply-chain management. Volkswagen, Airbus, and DHL have each published pilot studies showing modest but measurable improvements in fleet routing and inventory planning.

Collectively, these initiatives represent a market still below USD 1 billion in annual revenue - but a much larger one in engagement. Each pilot adds new developers, new libraries, and new mental models to the ecosystem.

Timelines & Milestones to “Boringly Useful”

Every emerging technology goes through the same emotional curve: disbelief, hype, disillusionment, then quiet ubiquity. Quantum computing has spent thirty years in the first three stages. The next decade will decide if it finally reaches the fourth - the point where it stops being a headline and starts being plumbing.

2025–2028: The First Fault-Tolerant Qubits

The near term will be defined by a single milestone: the first verified, fault-tolerant logical qubit. Quantinuum, IBM, and Google are all on trajectories that point to roughly 2026–2028 for credible demonstrations.

It won’t mean “general-purpose” quantum computing overnight, but it will be the Wright Brothers moment - the first powered flight, small and shaky but undeniably real. Investors and policymakers will treat it as the transition point from physics to engineering.

2028–2030: Early Commercial Advantage

The next phase will bring application-specific advantage - narrow cases where quantum outperforms classical methods on meaningful problems. Expect the first wins in quantum chemistry (drug discovery, catalyst design) and materials science (battery optimization, superconductors), followed by finance and optimization as hardware matures.

2030–2035: Integration, Standards, and the Quiet Revolution

Once early advantage is proven, the rest of the ecosystem moves quickly. Quantum will shift from bespoke access to standard cloud infrastructure.

Standardization will accelerate adoption. PQC (now rolling out via NIST) will make enterprises quantum-safe by default. Interconnect standards for photonic and microwave links will let different quantum modules talk to one another. And development frameworks like Qiskit, Cirq, and Classiq will converge so developers can write one program and run it on any quantum machine.

Many industry voices expect that by the mid-2030s, we will move into what IBM’s Jay Gambetta calls the era of ‘boringly useful’ quantum computing - becoming a background capability baked into cloud stacks and largely invisible to most end users.

Next Time

Now you understand why quantum computing has been so difficult to build, the breakthroughs that have changed its trajectory, and the engineering challenges companies are racing to overcome. But that’s only half the story.

In the second part, we’ll shift the lens from science to business and explore the commercial landscape taking shape around this new platform: from tech giants like IBM, Google, and Microsoft to startups building qubits, control systems, and quantum-native software. We’ll look at where the value will accrue, who’s best positioned to capture it, and what it will take for quantum computing to move from headline to infrastructure.

Not a subscriber and want the second part and future deep dives delivered straight to your inbox? Join for free here.

Your Thoughts

This is just the third installment in our new deep-dive series, and I’d love to hear what you think. In particular, let me know if there’s anything you think could be improved.

Just hit reply - I read every message personally, and your feedback directly shapes how these deep dives evolve.

Share It Forward

If this deep dive gave you a clearer picture of where quantum computing is heading, or finally made the science click, please share it with one person who’d appreciate it too.

That small act helps us reach more curious, optimistic readers and keeps this project growing.

See you soon,
Max

P.S. If a friend forwarded this, you can join free here to get the second part and future deep dives straight to your inbox.
