A DeepMind veteran on the future of AI and quantum
Quantum computing has always felt just over the horizon, so I’ve only tracked its progress from a distance. But that horizon is suddenly much closer: prototype machines with around 100 logical qubits are already tackling niche but valuable AI workloads, and startups are racing toward the 1,000-qubit mark. Early pilots in areas like recommendation systems, financial fraud detection, and drug discovery hint at computational speed-ups that classical hardware can’t match. The bottleneck is shifting from fundamental physics to the absence of a mature “QMLOps” software layer—exactly the kind of infrastructure problem that engineers steeped in AI and data pipelines are equipped to solve.
To understand what it will take to turn these lab demos into production systems, I spoke with Jennifer Prendki. With a PhD in particle physics from the Sorbonne and MLOps leadership experience at companies like Atlassian and DeepMind, she is uniquely positioned to bridge the two worlds. The conversation that follows, edited for length and clarity, unpacks where quantum computing really stands, why now is the right moment for pragmatic builders to get involved, and the concrete steps teams can take to prepare their architectures.
Current State and Timeline
How close are we to useful quantum computing for AI applications?
While universal quantum computers capable of solving any AI problem may still be 10-15 years away, specific quantum applications for machine learning are emerging today. Companies such as IonQ, D-Wave, and Rigetti are already working with government agencies like NASA on real use cases. Today’s leading machines operate at roughly 100 logical qubits, with startups like PsiQuantum targeting the 1,000-qubit range within a few years.
The key distinction is that quantum computers for specific AI inference tasks—particularly those involving structured data and requiring massive parallel computations—are becoming viable now, not in the distant future. We’re seeing the tip of the iceberg, with the main challenge being the gap between research and production rather than the fundamental technology itself.
What’s the reality behind the quantum advantage?
The quantum advantage comes from a fundamental difference in computation. Classical computers explore paths sequentially, while quantum computers leverage superposition to explore many paths simultaneously. This isn’t just a linear improvement: an n-qubit register is described by 2^n amplitudes, so each added qubit doubles the available state space, and scaling from hundreds to thousands of qubits represents an exponential jump in computational capacity.
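To make that scaling concrete, here is a back-of-the-envelope sketch (my own illustration, not something from the conversation) of why classical hardware cannot simply brute-force its way through: merely storing the state of an n-qubit register classically requires 2^n complex amplitudes, a cost that doubles with every added qubit.

```python
# Back-of-the-envelope: bytes needed to hold the full state vector of an
# n-qubit register as complex128 amplitudes (16 bytes per amplitude).
def statevector_bytes(num_qubits: int) -> int:
    return (2 ** num_qubits) * 16

for n in (10, 20, 30, 40, 50):
    print(f"{n:2d} qubits -> {statevector_bytes(n):,} bytes")
# Each extra qubit doubles the requirement; around 50 qubits the state
# vector alone is already in the petabyte range, far beyond any single
# classical machine's memory.
```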
Large financial and pharmaceutical companies are investing in this expensive, early-stage technology because they’re already hitting the limits of classical computing for specific inference tasks. The bottleneck is speed and scale, and quantum offers a fundamentally different approach to overcome these limitations.
Near-Term Applications
Which AI/ML use cases show the most promise for quantum computing today?
Three areas demonstrate immediate potential:
- Recommendation Systems: Quantum computers excel at personalized, high-speed inference where models aren’t overly complex but require rapid transformations across massive user bases. Production-setting tests are underway to transform recommendation models into quantum machine learning algorithms. Examples: ContentWise; Recruit
- Financial Applications: Anomaly detection and fraud detection benefit from quantum’s ability to process vast numbers of transactions and identify subtle patterns in real-time. Credit card fraud detection, portfolio risk analysis, and routing optimizations are active areas. Examples: Deloitte; Caixa Bank
- Pharmaceutical and Precision Medicine: Drug discovery and personalized medicine applications leverage quantum’s ability to explore vast dimensional spaces quickly. These problems have incredibly high dimensionality but not necessarily huge data volumes, making them ideal for quantum approaches. Examples: Merck; Biogen
These applications share characteristics: they involve structured data, require exploring many possibilities in parallel, and face computational limits with classical hardware.
Why are companies investing now if the technology is still emerging?
These companies aren’t investing speculatively—they’re hitting real computational walls with classical inference. When you need to tailor medication to a specific patient or detect fraud in real-time across millions of transactions, classical computing approaches become prohibitively slow or expensive. They’re investing now to have working prototypes ready as more powerful quantum hardware becomes available in the next few years.
Technical Architecture and Infrastructure
What does the quantum computing stack look like compared to traditional ML infrastructure?
The quantum stack is fragmented and primitive compared to classical ML infrastructure. Key differences include:
- No standardized operating system for quantum computers
- No containerization equivalent to Docker or Kubernetes
- No mature CI/CD pipelines or orchestration tools
- Multiple competing hardware technologies (superconducting, trapped ion, photonic) requiring completely different approaches
- Extreme physical requirements: Some need near-absolute-zero cooling, while newer photonic approaches can operate at room temperature
The ecosystem lacks the standardized software infrastructure that enabled ML to move from research to production. This absence of a “QMLOps” framework is the biggest barrier to wider adoption.
How do quantum computers actually process machine learning workloads?
Quantum computers should be thought of as specialized accelerators—much like GPUs—rather than replacements for classical systems. The typical workflow involves:
- Classical systems store and manage the data
- Data gets encoded into quantum states through “quantum embeddings”
- Quantum processors perform rapid transformations
- Results are measured (which collapses the quantum state)
- Outputs return to classical systems for storage and further processing
This hybrid approach is permanent, not transitional. Quantum computers are brilliant at fast computation but “suck at remembering things.”
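To make the loop above concrete, here is a minimal, simulator-only sketch (my own NumPy illustration, not code from any quantum SDK or from the interview) of the encode, transform, measure, return cycle. The feature values, gate choice, and shot count are arbitrary placeholders.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def encode(features):
    """Angle-encode each feature as a rotation of one qubit, then tensor them together."""
    state = np.array([1.0])
    for x in features:
        qubit = ry(x) @ np.array([1.0, 0.0])   # rotate |0>
        state = np.kron(state, qubit)
    return state

# 1. Classical system holds the data.
features = [0.3, 1.1]

# 2. Encode into a quantum state (a "quantum embedding").
state = encode(features)

# 3. The quantum processor applies a transformation (here: a CNOT entangling gate).
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
state = cnot @ state

# 4. Measurement collapses the state; repeated runs ("shots") give samples.
probs = np.abs(state) ** 2
shots = np.random.choice(len(probs), size=1000, p=probs)

# 5. Results go back to classical systems for storage and further processing.
counts = {format(i, "02b"): int((shots == i).sum()) for i in range(len(probs))}
print(counts)
```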
Data Operations Challenges
What’s the “no-cloning theorem” and why does it fundamentally break traditional data operations?
The no-cloning theorem is a fundamental principle of quantum mechanics stating you cannot create an exact copy of an unknown quantum state. A qubit exists in superposition (e.g., both 0 and 1 simultaneously), but observing it to see its value causes the superposition to collapse into a single state, fundamentally altering the information.
For data engineers and MLOps practitioners, the implications are staggering:
- No backups or replication: You cannot copy quantum data for redundancy
- No reproducibility: An error seen in one run may not reappear when the computation is rerun, because measurement outcomes are probabilistic
- No traditional data lineage: Tracing data provenance becomes nearly impossible
- No model checkpoints: You can’t save and reload quantum model states
This forces a complete paradigm shift from working with static, versioned datasets to regenerating quantum states on demand.
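To illustrate the reproducibility point, here is a tiny sketch (my own illustration): two runs of the same prepared superposition yield different measurement records, so comparisons have to happen at the level of distributions over many shots rather than individual outcomes.

```python
import numpy as np

# The same prepared superposition gives different measurement outcomes run
# to run, so "reproducing" a result means regenerating the state and
# comparing statistics, not replaying a saved dataset.
rng = np.random.default_rng()

# An equal superposition of |0> and |1>: measurement probabilities 0.5 each.
probs = np.array([0.5, 0.5])

run_a = rng.choice([0, 1], size=20, p=probs)
run_b = rng.choice([0, 1], size=20, p=probs)
print("run A:", run_a)
print("run B:", run_b)
print("identical sequences?", np.array_equal(run_a, run_b))  # almost certainly False

# With enough shots, the statistics converge even though individual
# outcomes cannot be copied or replayed.
many = rng.choice([0, 1], size=100_000, p=probs)
print("empirical P(1):", many.mean())   # ~0.5
```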
How do you manage data when quantum states can’t be copied or stored?
The key is understanding that quantum computers are accelerators, not databases. The current approach involves:
- Storing classical data in traditional systems
- Loading data into quantum states for processing on demand
- Processing in quantum space using entanglement and superposition
- Measuring results (which destroys the quantum state)
- Storing results back in classical systems
Research into “quantum embeddings” aims to encode classical information more efficiently. With 100 qubits, the goal isn’t just storing 100 features but leveraging entanglement to encode far more information by representing relationships between features in high-dimensional Hilbert space.
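For a sense of what such an encoding can look like, here is a minimal sketch of amplitude encoding (my own illustration of one common embedding scheme, not necessarily the one any particular vendor uses): a normalized vector of 2^n classical feature values becomes the amplitudes of an n-qubit state.

```python
import numpy as np

def amplitude_encode(features):
    """Pad a feature vector to the next power of two and normalize it so it
    can serve as the amplitudes of an n-qubit state (n qubits hold 2**n values)."""
    x = np.asarray(features, dtype=float)
    n_qubits = int(np.ceil(np.log2(len(x))))
    padded = np.zeros(2 ** n_qubits)
    padded[: len(x)] = x
    norm = np.linalg.norm(padded)
    if norm == 0:
        raise ValueError("cannot encode the all-zero vector")
    return padded / norm, n_qubits

state, n_qubits = amplitude_encode([3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0])
print(n_qubits, "qubits encode", len(state), "feature values")  # 3 qubits, 8 amplitudes
print("state norm:", np.linalg.norm(state))                     # 1.0, a valid quantum state
```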
Topological Data Analysis and New Paradigms
How does quantum computing change our approach from “extrapolation” to “modeling”?
Classical machine learning, even with LLMs, essentially performs sophisticated extrapolation from data points. We aren’t truly modeling the underlying process that generated the data. In contrast, physics seeks to explain fundamental processes—understanding the “why” behind observations.
Quantum computing enables thinking about data as having a “shape” or “data manifold” in high-dimensional space. Instead of discrete points, we can model the intrinsic structure of data using Topological Data Analysis (TDA). This approach:
- Captures the true underlying distribution of datasets
- Enables generation of high-fidelity synthetic data by sampling from learned manifolds
- Moves from modeling data points to modeling the data-generating process itself
TDA’s computational requirements have been prohibitive on classical hardware, but quantum computers are naturally suited for these complex calculations, potentially making this a killer application.
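As a purely classical toy illustration of the “shape at a scale” idea behind TDA (my own sketch; it runs on a laptop and involves no quantum hardware), here is the zero-dimensional case: counting connected components of a point cloud as the connection radius grows. Quantum TDA proposals target topological summaries like these, such as Betti-number estimation, at scales that are costly for classical hardware.

```python
import numpy as np
from scipy.sparse.csgraph import connected_components
from scipy.spatial.distance import cdist

# Two well-separated clusters of points in the plane.
rng = np.random.default_rng(0)
cluster_a = rng.normal(loc=0.0, scale=0.2, size=(30, 2))
cluster_b = rng.normal(loc=3.0, scale=0.2, size=(30, 2))
points = np.vstack([cluster_a, cluster_b])

# Connect any two points closer than `radius` and count connected components.
distances = cdist(points, points)
for radius in (0.05, 1.0, 4.0):
    adjacency = distances <= radius
    n_components, _ = connected_components(adjacency, directed=False)
    print(f"radius {radius:>4}: {n_components} connected component(s)")
# Very small radii leave most points isolated, an intermediate radius reveals
# the two clusters, and a large radius merges everything into one component.
```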
Talent and Skills Development
Do data engineers and scientists need PhDs in quantum physics to contribute?
Absolutely not. The core mathematics, particularly linear algebra, is already familiar to ML practitioners. The main challenge is adopting a different mental model:
- Superposition: Multiple states existing simultaneously
- Entanglement: Interdependencies between data points
- Measurement collapse: Extracting information destroys quantum states
- Probabilistic thinking: Moving from deterministic to statistical outcomes
The field desperately needs “bridge talent”—engineers who understand classical MLOps and can learn quantum circuit basics. It’s more like the shift from procedural to functional programming than starting from scratch.
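To show how familiar the underlying math really is, here is a short sketch (my own illustration, not tied to any particular SDK): superposition, entanglement, and measurement probabilities all reduce to matrix-vector products and squared amplitudes.

```python
import numpy as np

# Gates are unitary matrices, states are vectors, and measurement
# probabilities are squared amplitudes: linear algebra ML folks already know.
ket0 = np.array([1.0, 0.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard: creates superposition
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                  # entangles two qubits

superposed = H @ ket0                            # (|0> + |1>) / sqrt(2)
bell = CNOT @ np.kron(superposed, ket0)          # entangled Bell state
print("Bell amplitudes:", bell)                  # approx [0.707, 0, 0, 0.707]
print("Measurement probabilities:", np.abs(bell) ** 2)  # 00 and 11 each with p = 0.5
```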
Where can practitioners learn quantum computing for ML applications?
The educational landscape is sparse but growing. Most quantum materials are written for physicists, while standard ML resources overlook quantum concepts, so no comprehensive quantum ML courses exist on major platforms yet. Practitioners can gain hands-on experience through vendor SDKs and cloud services such as IBM Qiskit, AWS Braket, and IonQ’s platform, which provide simulators and limited hardware access. Domain-specific resources like the “Quantum of Data” blog also offer practitioner-focused content. What is needed most are educational materials that bridge the two worlds by targeting ML practitioners rather than physicists.
Strategic Considerations
What are the risks of not investing in quantum computing capabilities?
The primary risks include:
- Competitive disadvantage: If quantum provides 100x speedup for your use case, companies without quantum capabilities may become uncompetitive.
- Security vulnerabilities: “Q-Day” is the hypothetical moment when quantum computers become powerful enough to break today’s widely used public-key encryption, compromising the security of personal communications, bank accounts, and other sensitive information.
- Talent scarcity: The pool of quantum-classical bridge talent is extremely limited and will become more competitive.
- Ecosystem lock-out: As standards emerge, late adopters may find themselves excluded from shaping critical infrastructure.
How does the geopolitical landscape affect quantum computing adoption?
The geopolitical landscape significantly affects quantum computing adoption, largely driven by major investment from the US and China in a rivalry similar to that seen in AI. This competition creates several key considerations for organizations. Much of the government funding is motivated by national security applications, which could lead to export controls that limit access to critical quantum hardware. Talent in the field is extremely scarce and geographically concentrated, and open-source quantum efforts currently lag behind proprietary systems. The US also lacks a national program as cohesive as China’s state-level funding. Organizations must factor these geopolitical elements into their quantum strategies.
What should CTOs and tech leaders do today?
- Identify candidate workloads where inference latency or combinatorial search dominates costs
- Engage hardware partners for exploratory runs—most offer credits and joint research agreements
- Build hybrid-stack readiness with container abstractions and orchestration supporting specialized accelerators
- Cultivate bridge talent through training programs and hiring
- Join standards conversations to help shape QMLOps before it ossifies
- Implement post-quantum cryptography regardless of quantum adoption timeline
The technology will eventually feel like better, faster infrastructure—much like GPUs accelerated deep learning—but only if we build the necessary software layer between quantum hardware and AI applications.