Quantum Computing - The Business & Opportunity
19th November 2025
Quantum computing is still a science project on most balance sheets. Revenue is tiny, machines are noisy, and most “deployments” look more like PhD experiments than production systems. And yet, if you zoom out a decade, the outlines of a new computing industry are already visible.
Last week’s edition, which you can find here, showed how we get from fragile qubits to fault-tolerant machines. This week’s asks the commercial questions:
Who in the value chain actually makes money?
Which sectors get real advantage first?
What has to happen technically and geopolitically for this to scale?
Where are the career and founder opportunities while everything is still early?
By the end, you should be able to talk about quantum not just as physics, but as a strategy.
The Value Map - Who Makes Money Where
The quantum stack looks a lot like the classical stack - materials → chips → operating systems → applications → services - just colder, fussier, and more vertically integrated. The key difference is where margin sits today versus where it’s likely to migrate.
Right now, value pools are clustered upstream: in rare materials, cryogenic systems, and IP-heavy hardware. Over the next decade, power is likely to flow toward cloud platforms and domain-specific applications, much as it did in classical computing.
Let’s walk the stack from bottom to top.
Inputs & Materials
At the base are suppliers of quantum-grade components: ultra-pure superconductors, isotopically purified silicon-28, precision lasers, vacuum systems, and dilution refrigerators that cool chips to millikelvin temperatures. Companies like Bluefors and Oxford Instruments dominate cryogenics; Toptica and Hamamatsu sell tunable lasers and photonic detectors; Edwards and Pfeiffer support ultra-high vacuum.
This layer is small but high-margin: there are few qualified vendors, each sale is expensive, and customers are mostly governments, hyperscalers, or well-funded startups. The moat here is operational: deep manufacturing know-how, long lead times, and tight integration with leading labs. Even if quantum computing under-delivers, these suppliers still benefit from investment in adjacent quantum tech like sensing and communication.
Enabling Hardware
The next layer is the hardware core - the companies actually building quantum processing units (QPUs) and their control electronics. This includes:
Superconducting qubits – IBM, Google, Rigetti
Trapped ions – IonQ, Quantinuum
Photonic – PsiQuantum, Xanadu
Neutral atoms – Pasqal, QuEra, Infleqtion (formerly ColdQuanta)
Spin qubits – Intel, Quantum Motion
Around them sit control-stack specialists like Quantum Machines (sequencers and control electronics). Classical chipmakers are pushing in too: Intel has built cryo-CMOS controllers - control electronics designed to operate inside the ultra-cold refrigerators used by quantum chips - while NVIDIA pairs its GPUs with quantum control systems for real-time calibration and error decoding.
Today, this is where most capital and prestige concentrate. Hardware leaders own patents on qubit designs, fabrication recipes, and system architectures; they also win the lion’s share of government contracts and flagship corporate partnerships. But margins are not yet software-like. Systems are expensive to build, slow to amortize, and revenue is still largely R&D or cloud-access fees, not at-scale production.
Middleware & Software Platforms
Above the physics sit the software frameworks, compilers, and error-mitigation layers that turn devices into something developers can actually program. Here, IBM’s Qiskit, Google’s Cirq, and Microsoft’s Q# dominate the open ecosystem, while firms like Q-CTRL, Riverlane, and Classiq build specialized tooling for error suppression, circuit optimization, and automatic circuit synthesis.
This layer is where the industry quietly becomes usable. Azure Quantum, Amazon Braket, and IBM’s runtime environments now let developers submit hybrid jobs where classical Python orchestrates small quantum subroutines (‘kernels’) to run the most computationally difficult parts.
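To make that concrete, here's a minimal sketch of a hybrid job in Qiskit (assuming the qiskit and qiskit-aer packages are installed): classical Python builds a small circuit, a backend runs it, and classical code post-processes the measurement counts. Swapping the local simulator for a cloud backend is the main change on real hardware.

```python
# A minimal hybrid job: classical Python orchestrates a small quantum circuit.
# Runs locally via the Aer simulator; a cloud backend would slot in the same way.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2, 2)
qc.h(0)                     # superposition on qubit 0
qc.cx(0, 1)                 # entangle qubits 0 and 1
qc.measure([0, 1], [0, 1])  # read both qubits out

backend = AerSimulator()
counts = backend.run(transpile(qc, backend), shots=1_000).result().get_counts()
print(counts)               # classical post-processing starts here, e.g. {'00': ..., '11': ...}
```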
The moat here is developer mindshare and integration. Whoever becomes the “Windows of quantum” - the default tooling in universities and industry - will shape how algorithms are written and which hardware they target.
Cloud Platforms
In parallel, the hyperscalers are positioning themselves as aggregators of quantum capacity. As we saw last week, AWS Braket, Azure Quantum, and IBM’s Quantum Network already resell access to multiple hardware backends via a unified cloud interface - quantum as a (very premium) service.
This layer is likely to be one of the long-term profit centers:
It is asset-light relative to building hardware.
It benefits from demand aggregation and utilization economics.
It lets cloud vendors bundle quantum alongside AI, storage, and classical high-performance computing (HPC).
If quantum computing matures, most enterprises will not buy fridges; they’ll buy API calls. That pushes pricing power toward whoever owns the relationship with developers and CIOs - exactly where the hyperscalers already live.
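Here's what "buying API calls" looks like in practice - a minimal sketch with the Amazon Braket SDK (assuming amazon-braket-sdk is installed; the commented-out device ARN is illustrative). The same circuit runs on a free local simulator or, with a one-line change, on managed hardware billed per task.

```python
# Minimal sketch: "quantum as API calls" via the Amazon Braket SDK.
from braket.circuits import Circuit
from braket.devices import LocalSimulator

bell = Circuit().h(0).cnot(0, 1)     # entangle two qubits

device = LocalSimulator()            # free, runs on your laptop
# device = AwsDevice("arn:aws:braket:::device/...")  # managed hardware, billed per task

result = device.run(bell, shots=1_000).result()
print(result.measurement_counts)     # roughly half '00', half '11'
```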
Applications & Algorithms
On top of the stack sit domain-specific applications: chemistry solvers for drug discovery, optimization engines for supply chains, Monte Carlo accelerators for finance, quantum-enhanced ML for pattern recognition. Startups like QC Ware, Classiq, Qubit Pharmaceuticals, and Quantinuum’s software arm are already building this layer, often in tight partnership with enterprises.
This is where “who cares?” turns into “how much is that worth?”
For pharma and materials, shaving years off R&D cycles or unlocking molecules that classical methods cannot reach is worth billions.
For banks, better risk modeling and optimization on large books can move return on equity (ROE) by basis points that are worth fortunes - five basis points of improvement on a $500B book is $250M a year.
For logistics and manufacturing, modest percentage improvements in routing and scheduling compound into huge cost savings.
Long-term, this layer likely captures outsized value - the way enterprise SaaS and AI applications capture more value than chipmakers, despite being built on their work.
Integration & Services
Most enterprises will not write quantum circuits from scratch. They’ll use systems integrators to stitch quantum APIs into their existing enterprise resource planning, supply-chain, and analytics workflows. Accenture, Deloitte, Capgemini, BCG, McKinsey, and IBM Consulting already run quantum advisory practices, often bundled with cloud-modernization or AI programs.
In the near term, this layer captures a lot of the actual cash: day rates for pilots, strategy decks, proof-of-concept integrations. It’s classic “sell shovels in the gold rush” economics. Over time, as quantum tools stabilize, this becomes a more standard systems-integration business – still valuable, but less uniquely advantaged.
End-User Industries
At the very top are the sectors that quietly benefit: pharma, chemicals, automotive, aerospace, energy, finance, logistics, and eventually parts of AI and cybersecurity. Many end-users won’t even know they’re “using quantum” - they’ll just see better simulation results, faster optimizers, or new features in familiar software.
For them, quantum isn’t a product; it’s a differentiator inside products. The moat comes from embedding quantum workflows deeply enough that competitors can’t easily copy the resulting IP or datasets.
What Unlocks This - Bottlenecks, Timelines & Signals
From a business perspective, quantum’s bottlenecks are less about buzzwords like “superposition” and more about operational reliability, manufacturability, and trust. The physics is hard, but the strategic question is: when does this stop being an experiment and start being infrastructure?
You can think about it in five constraints - each with concrete signals to watch.
Constraint 1: Fault-tolerance and usable logical qubits
Right now, most devices sit firmly in the NISQ regime – noisy intermediate-scale quantum machines: tens to low-hundreds of qubits where you can only run fairly short programs before errors take over. Error rates per two-qubit gate are typically 0.1–1%, which makes long computations impossible without error correction.
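A quick back-of-envelope shows why those numbers bite. Treating gate errors as independent (an assumption, but a reasonable one for sizing), a circuit's success probability decays as (1 - p)^N for N gates at per-gate error rate p:

```python
# Why NISQ circuits must stay short: success probability ~ (1 - p)^N
# for N gates at per-gate error rate p (assuming independent errors).
for p in (0.01, 0.001):                  # 1% and 0.1% two-qubit gate error
    for n_gates in (100, 1_000, 10_000):
        print(f"p={p:.1%}  gates={n_gates:>6}  P(no error) ≈ {(1 - p) ** n_gates:.2e}")
```

At a 1% error rate, a 1,000-gate circuit succeeds about 0.004% of the time - which is why error correction is non-negotiable for long computations.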
The unlock is reliable logical qubits whose error rates improve as you add redundancy. As mentioned last week, Google, IBM, Quantinuum, and others have recently demonstrated below-threshold logical qubits and early error-correction experiments; the consensus window for the first credibly fault-tolerant logical qubit is around 2026–2028.
Signals to watch:
Two-qubit physical gate errors consistently below 0.1%.
Announcements of logical qubits surviving 100+ logical gate operations - i.e., running a meaningful little program end-to-end - without their error rate creeping back up.
More companies shifting their public roadmaps toward logical qubits and target error budgets, not just raw physical qubit counts, and backing that shift with regular, data-backed progress updates.
When those metrics stabilize, quantum moves from being a probabilistic stunt machine toward something like a reliable co-processor.
Constraint 2: Scalable manufacturing and cost per qubit
Even if error rates improve, you still need a lot of qubits. Useful chemistry and cryptography workloads may require millions of physical qubits - or hundreds to thousands of logical qubits - to deliver clear advantage.
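Where does "millions" come from? A rough sizing sketch, under the common assumption that a distance-d surface code costs roughly 2d² physical qubits per logical qubit (illustrative numbers, not vendor specs):

```python
# Back-of-envelope: physical-qubit overhead of surface-code error correction.
# Assumption: one distance-d logical qubit ~ 2 * d**2 physical qubits.
def physical_qubits(n_logical: int, distance: int) -> int:
    return n_logical * 2 * distance**2

for d in (15, 25):
    print(f"distance {d}: 1,000 logical qubits ≈ {physical_qubits(1_000, d):,} physical")
# distance 25 -> ~1.25M physical qubits for 1,000 logical ones
```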
Today, qubits are effectively handcrafted. Superconducting chips depend on exotic materials and very low-volume fabs; ion traps require bespoke lasers and vacuum systems; photonic devices need advanced silicon photonics production. None of this looks like the high-throughput, high-yield world of conventional semiconductor manufacturing.
The unlock is moving from lab-grade hardware to something closer to industrial manufacturing:
Commercial fabs (GlobalFoundries, Intel) producing qubit wafers in volume.
Modular architectures that connect many smaller chips via photonic or microwave links.
Larger, more automated cryogenic systems that can host thousands of qubits without insane marginal costs.
Signals to watch:
Announcements of >10,000-qubit systems (even if noisy) from leading vendors.
Quantum chips being produced by mainstream chip fabricators with reasonable success rates.
Dramatic drops in cloud pricing per circuit execution (orders of magnitude over a few years).
When cost per useful qubit falls and scaling curves look more like Moore’s Law than bespoke art, the business case changes dramatically.
Constraint 3: Application maturity and software talent
A surprising bottleneck is not just hardware - it’s knowing what to run on it. Quantum algorithms for chemistry, optimization, and machine learning are still young, and the tooling has a “research-lab” feel.
The unlock is a stable software stack and talent base:
Mature software development toolkits and translation tools that hide hardware differences and error-mitigation tricks.
Hybrid frameworks where a small quantum subroutine (a ‘kernel’) acts like a specialized accelerator alongside GPUs - the classical program does most of the work, but sends the hardest part to a quantum chip (see the sketch after this list).
A much larger pool of “quantum-literate” developers and domain experts.
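In miniature, the kernel pattern looks like this: a classical optimizer (SciPy here) repeatedly calls a quantum subroutine. In this sketch the "quantum" part is stubbed out analytically - the expectation of Z after an RY(θ) rotation on |0⟩ is cos θ - whereas a real version would submit a parameterized circuit to a backend:

```python
# Minimal hybrid loop: classical optimizer outside, quantum kernel inside.
import numpy as np
from scipy.optimize import minimize

def quantum_kernel(theta: float) -> float:
    # Stub for a circuit run: <Z> after RY(theta) on |0> equals cos(theta).
    # Real version: execute a parameterized circuit, estimate <Z> from shots.
    return float(np.cos(theta))

result = minimize(lambda x: quantum_kernel(x[0]), x0=[0.1], method="COBYLA")
print(f"theta* ≈ {result.x[0]:.3f} (expect ~3.142), <Z> ≈ {result.fun:.3f} (expect -1)")
```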
Universities, big tech, and governments are pushing hard on workforce development; demand for quantum-skilled workers has nearly tripled since 2018, but supply is still tight.
Signals to watch:
Major enterprise dev tools (MATLAB, Wolfram, big-cloud AI platforms) adding quantum backends as standard, fully supported features.
A noticeable rise in junior-level quantum roles (vs just PhD/Principal Scientist listings).
Case studies where quantum applications are built and maintained by standard engineering teams with only a few quantum specialists.
Constraint 4: Market trust, crypto, and regulation
Quantum has a trust problem from two angles. On one hand, buyers don’t want to invest heavily in a technology that might end up delayed by many years. On the other, regulators and security leaders worry that large quantum machines will break today’s cryptography.
The technical solution is well underway: post-quantum cryptography (PQC) standards - new encryption schemes designed to resist attacks from quantum computers - are being rolled out, and organizations are beginning the long process of inventorying and upgrading their cryptographic infrastructure.
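At the code level, the PQC transition is refreshingly mundane. A minimal key-encapsulation sketch, assuming the open-source liboqs-python bindings (algorithm names vary by version - older builds expose ML-KEM as "Kyber768"):

```python
# Post-quantum key encapsulation with liboqs-python (a sketch, not production code).
import oqs

ALG = "ML-KEM-768"  # NIST-standardized KEM; name is version-dependent in liboqs

with oqs.KeyEncapsulation(ALG) as receiver, oqs.KeyEncapsulation(ALG) as sender:
    public_key = receiver.generate_keypair()              # receiver publishes this
    ciphertext, secret_at_sender = sender.encap_secret(public_key)
    secret_at_receiver = receiver.decap_secret(ciphertext)
    assert secret_at_sender == secret_at_receiver         # both sides share a key
```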
Politically, export controls and national-security concerns are starting to fragment global collaboration - especially between the U.S. and China - even as multi-national initiatives like the EU’s Quantum Flagship and consortia such as QED-C push for openness.
Signals to watch:
PQC widely deployed in major web protocols and financial infrastructure.
Clearer export-control regimes defining which quantum capabilities are restricted.
Large enterprises linking “quantum adoption” and “quantum-safe position” in the same strategy documents.
Once the security transition is mostly done, quantum looks less like a threat and more like an accepted part of the compute landscape.
Constraint 5: Funding cycles and quantum winter risk
Finally, there’s the capital-markets constraint. Quantum is expensive, slow, and uncertain - a perfect recipe for over-hype followed, if milestones slip, by a “quantum winter”: a period where funding and enthusiasm dry up.
So far, the trend is positive: patent filings have grown roughly 5× over a decade, venture funding hit record levels in 2024, and global public funding is in the billions. Market forecasts cluster around a direct quantum-computing market of perhaps $5–10B by 2030 and ~$50–100B by 2035 if technical progress stays on track.
Signals to watch:
VC funding for quantum staying above ~$2–3B/year with diversified bets, not just a handful of SPACs.
At least one quantum company delivering clear commercial ROI for a customer, not just pilots and press releases.
When delays happen (and they will), the narrative doesn’t collapse into “quantum is dead” - instead, coverage remains focused on steady technical progress.
In practice, the late 2020s are the crunch period: enough technical progress to avoid winter, or a reset in expectations if fault-tolerance slips.
Who Wins, Who Loses, and How the World Changes
Sector-level winners
Pharma and chemicals are the most obvious early winners. Their core problems - simulating quantum systems like molecules and materials - map almost perfectly to quantum hardware. Early pilots from Roche, AstraZeneca, BASF, Mercedes-Benz, and energy majors around battery and catalyst design are already under way, even on noisy devices.
Any company that can shorten drug-discovery cycles, discover new catalysts for carbon capture, or design better battery materials will enjoy defensible advantage, because the resulting IP is hard to reverse-engineer even if competitors also get quantum access.
Finance is next. Banks and hedge funds already pay for tiny edge improvements; they’re accustomed to black-box models and hybrid HPC workflows. Quantum can slot into risk modeling, option pricing, portfolio optimization, and cryptography. Firms like JPMorgan, Goldman Sachs, BBVA, and Fidelity already run internal quantum teams and publish on hybrid quantum-classical methods.
Logistics, manufacturing, and mobility sit close behind. Vehicle routing, crew scheduling, inventory planning, and factory optimization are all combinatorial nightmares where even modest improvements matter. D-Wave pilots with Volkswagen, Airbus, DHL, and others hint at where this could go as hardware improves.
Energy, aerospace, and climate-tech benefit from a double hit: better materials (for batteries, photovoltaics, superconductors) and better optimization (for grids, flight paths, and planning).
And over all of this sit the cloud hyperscalers and vertically integrated quantum players (IBM, Google, Microsoft, Amazon, Quantinuum) who can monetize at multiple layers: hardware, cloud access, software tooling, and services.
Sector-level losers (or at least, the squeezed)
Commodity HPC vendors whose value proposition is “more FLOPS for less money” may find some workloads siphoned off to quantum accelerators once these cross certain thresholds. They’re unlikely to disappear, but high-margin niches (e.g., specific chemistry or optimization clusters) could erode.
Security laggards – governments, banks, and infrastructure providers slow to adopt PQC – risk “harvest now, decrypt later” attacks as adversaries store encrypted traffic today and decrypt it with quantum machines later.
Mid-tier hardware hopefuls that lack either deep IP or cloud distribution may be trapped between hyperscalers’ vertical stacks and specialized component suppliers.
Countries that under-invest in quantum research and talent risk becoming digitally dependent - relying on other nations’ quantum technologies much as late adopters in semiconductors found themselves reliant on foreign fabs.
The good news: “loser” here mostly means relative loser. Quantum is more likely to reshape competitive dynamics than to annihilate entire sectors.
Macro implications: labour, geopolitics, environment, society
On labour, quantum amplifies the trend already seen with AI: rising demand for hybrid skill sets - physics + software, math + domain expertise - and for “translators” who can bridge between quantum teams and business leaders. The lack of talent is already a bottleneck; estimates suggest hundreds of thousands of quantum-skilled workers will be needed globally by 2030.
On geopolitics, quantum is joining semiconductors and AI as a core strategic technology. The U.S., China, EU, Japan, and others are pouring billions into national quantum programs; export controls and investment restrictions are starting to carve the field into blocs. Expect:
More government-funded flagship machines and national quantum clouds.
Tighter controls on dilution refrigerators, certain sensors, and high-end devices.
Quantum experts becoming geopolitical assets in their own right.
On the environmental side, the direct footprint of dilution fridges and specialized labs is real but small, especially compared to AI data centers. The big upside is indirect: better catalysts for carbon capture, more efficient batteries, optimized grids with higher renewable penetration, and improved climate models.
Finally, society and trust. Breaking legacy cryptography is the clearest risk; PQC rollout is the countermeasure. But there’s also a subtler issue: concentration of compute power. If only a handful of states and cloud giants control early fault-tolerant machines, advantage - in science, finance, and intelligence - will be unevenly distributed. That raises familiar questions from AI, but with even more opaque hardware.
Regulators will eventually have to answer questions like: Who gets access to very large machines? Under what oversight? How do we audit what’s being run on them? That conversation has barely started.
Career & Founder Opportunities
Quantum is still early enough that individual career and startup choices can materially shape the field. The trick is to avoid the two extremes: betting everything on speculative hardware you can’t influence, or waiting so long that the best niches are taken.
Early-career: become “quantum + X”
If you’re early in your career, the highest-leverage move is usually to pair solid fundamentals with one quantum-adjacent specialty:
Physics, Electrical Engineering, or Computer Science plus coursework or research in quantum information.
Chemistry, materials science, or finance plus enough quantum to collaborate with specialists.
Strong software engineering plus hands-on experience with Qiskit, Cirq, Azure Quantum, or Braket.
The industry has a shortage not only of theorists, but of people who can ship: write production-grade code, manage cloud infrastructure, or run large collaborations. Many roles will look less like “Quantum Wizard” and more like “Backend Engineer, quantum systems” or “Data scientist, quantum-accelerated workflows.”
Mid-career: become the translator or the builder of teams
If you already have 5–15 years in a domain - pharma, energy, automotive, finance, logistics - you’re in an enviable position. You don’t need to become a quantum physicist; you need to become the bridge:
Understand enough quantum to distinguish plausible roadmaps from hype.
Map your organization’s hardest computational problems to quantum-relevant patterns (chemistry, optimization, Monte Carlo).
Lead “exploration pods” that run pilots with cloud quantum providers and software startups.
Enterprises will pay a premium for people who can speak both CFO and “quantum” – translating between balance sheets and the quirks of qubits.
Founder opportunities: where to build now
Founders face the classic platform-shift dilemma: build too low in the stack and you’re competing with hyperscalers; too high and your product depends on hardware that doesn’t yet exist. But there are opportunities where solving today’s pain leads naturally into tomorrow’s upside:
Domain-specific quantum applications in chemistry, materials, and optimization
Start by wrapping classical and quantum-inspired algorithms in tools chemists, materials scientists, or logisticians actually want to use.
As hardware improves, swap in genuine quantum accelerators under the hood.
Developer tools, observability, and simulation
Better circuit debuggers, cross-hardware translators, cost estimators, and hybrid workflow managers.
High-fidelity simulators plus profiling tools that help teams decide when a problem is “quantum-ready.”
Quantum-safe and “crypto migration” platforms
Inventory and risk-assessment tools that scan enterprises for vulnerable cryptography (a minimal sketch appears at the end of this section).
Orchestration platforms that help roll out PQC across complex fleets with minimal disruption.
Education and workforce platforms
Bootcamps, MOOCs, and corporate training that teach “quantum for software engineers” or “quantum for chemists.”
Simulation-first learning environments that don’t require hardware access.
Vertical system integrators
Firms that specialize in, say, “quantum for automotive,” stitching together cloud access, software, and domain-specific models.
As with early AI consultancies, the best of these may evolve into product companies.
The unifying theme: start with hybrid value - where classical methods still do most of the work, but quantum is clearly on the roadmap - so you’re useful long before full fault-tolerance arrives.
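To ground the crypto-migration idea above: the inventory piece can start as simply as scanning certificates for quantum-vulnerable public keys. A minimal sketch, assuming Python's cryptography package and a hypothetical certs/ directory:

```python
# Sketch of a crypto-inventory scan: flag certs whose public keys are
# quantum-vulnerable (RSA/ECC). A real product would also cover TLS
# endpoints, code repositories, and key stores.
from pathlib import Path
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

def audit_cert(pem_path: Path) -> str:
    cert = x509.load_pem_x509_certificate(pem_path.read_bytes())
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        return f"{pem_path.name}: RSA-{key.key_size} (quantum-vulnerable)"
    if isinstance(key, ec.EllipticCurvePublicKey):
        return f"{pem_path.name}: ECC {key.curve.name} (quantum-vulnerable)"
    return f"{pem_path.name}: {type(key).__name__} (review manually)"

for pem in Path("certs").glob("*.pem"):   # hypothetical directory of certs
    print(audit_cert(pem))
```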
Cross-Field Connections - How Quantum Plugs Into the Rest of the Frontier
Quantum does not evolve in isolation. Its most interesting effects show up when you consider how it interacts with AI, cryptography, advanced materials, biotech, and networking.
Quantum × AI
Two big intersections:
Quantum for AI – using quantum subroutines to speed up or improve certain machine learning tasks: sampling from complex distributions, kernel methods, or optimization for very high-dimensional models. Research is early but active, and vendors explicitly pitch quantum + AI bundles.
AI for quantum – using machine learning to design better pulse sequences, suppress noise, and automatically discover error-mitigation strategies. Here, AI is likely to have near-term impact by making today’s devices more usable and tomorrow’s devices easier to control.
Think of it as a feedback loop: the better quantum gets, the more ambitious AI experiments become; the better AI gets, the more we can tame noisy quantum hardware.
Quantum × materials, energy, and climate
Quantum’s most direct superpower is simulating quantum systems. That puts it on a collision course with:
Battery and materials design – better cathodes, solid electrolytes, high-temperature superconductors.
Catalysts for fertilizer production, carbon capture, and green hydrogen.
Photovoltaics and thermoelectrics with tailored band structures.
Breakthroughs in these areas feed back into the energy transition and climate mitigation, effectively turning quantum R&D into leverage on multi-trillion-dollar transitions.
Quantum × biotech and healthcare
Quantum intersects with healthcare via:
Protein folding and dynamics, where even small improvements in simulation accuracy can improve hit rates in drug discovery.
Medical imaging and optimization, where quantum algorithms might eventually help reconstruct images faster or design more efficient data-acquisition methods.
Hospital operations, where quantum optimization could be applied to staffing, operating-room scheduling, and supply management.
You can imagine future health-tech stacks where a physician’s decision-support system quietly queries both classical and quantum backends through a shared interface.
Quantum × cryptography and networks
The most publicized link is adversarial - quantum breaks current public-key cryptography - but the constructive side is just as interesting:
Quantum-safe algorithms (PQC) reshape how we think about long-term data security.
Quantum key distribution (QKD) and early quantum networks use quantum states to share encryption keys in ways that reveal any eavesdropping, creating new models for ultra-secure communication and for linking sensors across long distances (a toy simulation follows this list).
Distributed quantum computing could eventually knit together many small quantum modules into something functionally larger - the quantum equivalent of today’s cloud clusters.
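The eavesdropping-detection property is concrete enough to simulate. A toy BB84 sketch in plain NumPy (no real photons involved): bits measured in matching bases form the shared key, and an interceptor who guesses bases at random imprints a roughly 25% error rate that Alice and Bob can detect by comparing a sample:

```python
# Toy BB84: an eavesdropper reveals herself as ~25% errors in the sifted key.
import numpy as np

rng = np.random.default_rng(7)
n = 2_000
alice_bits  = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)    # 0 = rectilinear, 1 = diagonal
bob_bases   = rng.integers(0, 2, n)

def measure(bits, prep_bases, meas_bases):
    # Matching basis -> faithful readout; mismatched basis -> random outcome.
    return np.where(prep_bases == meas_bases, bits, rng.integers(0, 2, len(bits)))

for eavesdrop in (False, True):
    if eavesdrop:                      # Eve measures, then resends what she saw
        eve_bases = rng.integers(0, 2, n)
        sent_bits, sent_bases = measure(alice_bits, alice_bases, eve_bases), eve_bases
    else:
        sent_bits, sent_bases = alice_bits, alice_bases
    bob_bits = measure(sent_bits, sent_bases, bob_bases)
    sift = alice_bases == bob_bases    # keep only matching-basis rounds
    qber = np.mean(alice_bits[sift] != bob_bits[sift])
    print(f"eavesdropper={eavesdrop}: error rate in sifted key ≈ {qber:.1%}")
```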
These developments blur boundaries between “quantum computing,” “quantum communication,” and “quantum sensing,” creating a broader “quantum technologies” sector rather than a single product category.
Quantum × everything-else infrastructure
Finally, quantum will quietly reshape traditional infrastructure:
Cloud data centers will add quantum bays alongside GPU racks.
Developer and operations tools will become “quantum-aware” - able to track which workloads depend on specific quantum hardware or encryption methods, so teams know what needs updating as the technology evolves.
Regulatory frameworks for AI, data protection, and cybersecurity will be amended to include quantum considerations.
Seen through that lens, quantum is less a standalone revolution and more a new axis in the existing tech stack - one more accelerator that changes what’s computationally feasible.
Bringing It Together
Quantum computing in 2025 is a paradox: technically immature, commercially tiny, yet strategically unavoidable. The hardware is noisy; the road to millions of qubits is long. But the pieces - from cryogenics and qubit designs to cloud access, software stacks, and early pilots in chemistry, finance, and logistics - are clicking into place.
For investors, strategists, and builders, the right stance is neither blind faith nor casual dismissal, but optionality with conviction:
Track the hard signals - error rates, logical qubits, manufacturing scale, PQC rollout.
Build hybrid workflows that add value today but are ready to plug into quantum accelerators tomorrow.
Accumulate talent, data, and problem formulations that will compound once machines cross the utility threshold.
If the first part of this deep dive was about how we get fault-tolerant qubits, this part is about what you do when they arrive. The inflection won’t be a single “quantum day.” It will look more like cloud or AI: years of steady, boring integration - and then, suddenly, everyone will claim they saw it coming.
Tell Me What You Think
That wraps up our two-part Quantum Computing Deep Dive - and once again I’d love to hear your feedback on both editions.
Your thoughts genuinely shape how I evolve these deep dives. Just hit reply and tell me what you liked and what could be improved. I read every message.
Can I Ask a Small Favour?
If this deep dive helped you understand quantum even a little better, please consider forwarding it to someone who’d appreciate it too.
Every share helps us reach more curious, ambitious readers.
Thanks for reading, and for being part of this journey.
See you soon,
Max
P.S. If a friend forwarded this, you can join free here to get future deep dives straight to your inbox.