
Quantum computing may be entering its practical utility era, but full commercial advantage isn’t here yet.

Though quantum computing has moved past its futuristic, lab-based experiments into early-stage practical usefulness, it hasn’t yet reached the point where it definitively outperforms classical computing on commercial problems. However, that moment appears close at hand.

During a roundtable hosted by Economist Impact, moderator Steve Suarez, CEO of HorizonX Consulting, spoke to Michael Biercuk, CEO of Q-CTRL; Elham Kashefi, Chief Scientist at the UK National Quantum Computing Centre; Ashley Montanaro, Co-founder and CEO of Phasecraft; and Simon Fried, VP Corporate Communications at Classiq, about where we are in the race for quantum utility.

What is quantum utility?

The language around quantum has matured significantly in recent years. For years, claims of “supremacy” or “advantage” dominated conversations. These terms were useful for grabbing headlines, yet all they meant was that at some point these machines would outpace their classical counterparts. As the space evolves, the framing has shifted to the much more grounded concept of “utility”.

Biercuk explains that utility sits somewhere between supremacy and advantage – meaning the results coming out of quantum machines are genuinely competitive with those of a supercomputer, even if the evidence isn’t yet definitive that quantum is better. Kashefi adds that utility is not a single fixed concept, but one that depends on context: sometimes the metric is accuracy, sometimes energy cost, sometimes speed, sometimes privacy. Conversations are therefore turning from whether quantum will “win” to asking what quantum is actually useful for, and which problems it is already good enough to tackle.

Montanaro notes that quantum computers are already solving “really pretty challenging problems that the best classical algorithms in the world are having to really struggle to match.” He adds that there is now genuine debate about whether a quantum computer might already be preferable to a classical one in some situations – not just because it’s faster, but because it can offer a more accessible interface than a classical algorithm that might otherwise take a PhD student six months to implement.

The software stack is where the race is now

The quantum race is no longer purely about qubit counts. Instead, it is all about algorithm design, control software, benchmarking, and integration with classical and AI workflows.

The emerging quantum stack has distinct layers, each of which is doing a different job. At the top are companies that design quantum algorithms, the mathematical instructions that tell a quantum system what problem to solve. In the middle you have the companies that allow developers to synthesise and scale those algorithms without needing to work at the level of individual quantum gates. At the hardware interface level are companies which operate at what Biercuk calls “the last mile”, which is the infrastructure software that translates high-level instructions into real machine commands, reducing errors and delivering orders of magnitude better performance from the hardware.

This is already having a measurable commercial impact. SoftBank, after running workloads using Q-CTRL’s software on IBM hardware, concluded that quantum will play a crucial role in its commercial operations in the coming years. “SoftBank … are making long-term strategic commercial decisions based on what they are seeing,” says Biercuk.

What is quantum already being used for?

Use cases are already beginning to stack up, and each demonstrates not just that quantum offers an advantage, but why.

Fried explains that Classiq recently worked with Comcast and AMD on network resilience optimisation – modelling how a communications network could best reroute data if nodes were taken down by threats, equipment failure, or routine maintenance. The complexity of optimising a large network in real time is the kind of problem where classical methods hit practical limits.

BMW faced a similar challenge: packaging a cooling system within the constrained 3D geometry of an electric vehicle, minimising weight, reducing components, and optimising space. This is another area where classical computing would struggle as the number of variables grows.

Phasecraft has been working with the UK’s National Energy System Operator on energy network optimisation, and on a technique called quantum-enhanced DFT (density functional theory), which models the quantum mechanical behaviour of materials such as battery cathodes and solar cell components in ways that can replace months of laboratory work. Montanaro predicted that within five years, “quantum computing actually will be a standard tool for modelling these systems.”

Is quantum sensing already past utility?

While quantum computing works towards demonstrating an unambiguous commercial advantage, quantum sensing has been quietly advancing. Biercuk notes Q-CTRL’s demonstration of a quantum-enabled aircraft navigation system that delivered GPS-free positioning 100 times more accurate than the best conventional alternative – tested in a real-world flight. He called it “the first truly convincing demonstration since atomic clocks in the 1960s” of quantum sensing delivering a definitive advantage.

Other near-term sensing applications include minerals prospecting, underground target detection, and medical imaging. “The customers [in these spaces] know what they need,” says Biercuk. “They need GPS backups. We’re able to deliver that right now.”

No single hardware platform has won

After two decades of serious quantum hardware development, no single qubit modality – superconducting, trapped ion, neutral atom, photonic – has pulled ahead. But that’s OK. Nobody knows if any single qubit modality will dominate, and “anybody who tells you they know is selling you something,” says Biercuk.

Montanaro notes that he had expected significant hardware consolidation when he began his PhD in the mid-2000s and had been “consistently proven wrong” ever since – platforms in common use today weren’t even on the radar back when he started. Kashefi describes the UK National Quantum Computing Centre’s approach as a “United Nations of qubits,” actively working across all modalities to understand which platforms suit which applications best. Even Google, one of the largest investors in superconducting quantum hardware, recently announced it was expanding into neutral atom systems.

In practical terms, software abstracts away hardware dependency, freeing users to select whichever platform best suits a given problem: end users no longer need to choose a qubit platform any more than cloud computing customers need to choose a server architecture. “[I]t takes the entire ecosystem to raise a useful quantum application,” says Kashefi.

Government procurement is a catalyst

The UK government’s decision to become a quantum customer, rather than simply funding research, was welcomed by the panel. The development of GPUs was shaped by defence contracts. Night vision, stealth technology, and early mobile communication standards all received crucial early demand from public-sector customers willing to buy before the technology was fully mature, and quantum is following the same pattern.

Biercuk, who previously worked at the Defense Advanced Research Projects Agency (DARPA), highlights that “the power of government procurement to transform an emerging sector is second to none.” Kashefi adds that the UK’s quantum ecosystem has been building for more than 15 years, and the procurement move is partly designed to retain talent and competitive advantage: “We just don’t want to hand it over to other nations.” Fried notes that the UK’s approach, relative to other European countries, better reflects the need to shift from funding hardware experiments to building end-to-end computational workflows and communities of competent end users.

The next five years

Looking three to five years ahead, the panellists believe that quantum will stop being a specialised research tool and start becoming infrastructure.

Montanaro forecasts that quantum computing will have contributed to the discovery or optimisation of real physical materials – batteries, solar cells – that would not have been found using classical methods alone.

Kashefi predicts that quantum-AI integration will begin to address the energy costs and privacy limitations of large language models, unlocking what she calls “privacy-preserving, low-energy AI.”

Fried envisions quantum becoming simply part of the computational toolkit, “an alternative to GPUs,” a tool you “right-size with the right question.”

Biercuk returns to sensing, predicting that within five years, “when you get on an aeroplane, that aeroplane is going to have a resilient GPS backup based on quantum sensing,” making quantum technology “part of life-critical safety systems”.

The quantum transformation is already on its way. In navigation systems. In battery research. In financial infrastructure planning. It’s just that most people haven’t noticed it yet, and by the time they do, it will be just another part of how the world works.




