The quantum computing revolution - part II: the path to commercialisation
This is the second part of our deep dive into Quantum Computing (QC). In this article, we focus on the anticipated applications, provide an analysis of the activity in the early stage ecosystem in the UK, and discuss key investment considerations. For an accessible overview of the technology and the key challenges engineers face, please read the first part of the series: The quantum computing revolution, part I: the theory in a nutshell.
The NISQ era of quantum devices offers few commercial applications
As discussed in our first article, QC offers a new paradigm of exponentially higher computing power compared to what’s available today. Such capability promises new use cases and applications with implications for most aspects of our lives. However, the current state-of-the-art machines are too slow and too noisy (their error rates are too high) to deliver the anticipated commercial applications.
We distinguish between three categories of QC devices:
- Quantum Annealers (QAs) — in their purest form, annealers are a form of analog technology. QAs do not strive to entangle all qubits in a system, only neighbouring ones; they do not employ error correction mechanisms and cannot produce the set of parallel computations that gate models can. Instead, a problem is encoded in an initial energy state of the qubits of a quantum system, placed in a state of superposition. The energy state of the system is then manipulated until a ground state (i.e. the lowest energy state) is achieved, which corresponds to the solution. When the process is performed very slowly, with minimal dissipation of energy, it is referred to as adiabatic annealing. QAs are especially effective at solving optimisation problems. Theoretically, under a set of strict assumptions, adiabatic QAs can obtain results similar to gate-based computers, although they lack mechanisms for error correction. When these assumptions are relaxed, evidence is still lacking on whether QAs are superior even to classical computers.
- Noisy intermediate-scale quantum (NISQ) gate computers — these devices utilise a gate-based model to perform operations (similar to classical computers) but cannot suppress all errors. Decoherence and errors impair a NISQ machine’s ability to perform sophisticated calculations that require logical depth (the number of operations that can be performed sequentially). Often the software is specifically designed and optimised for a particular hardware implementation. These devices represent an intermediate step on the path to fully error-corrected gate-based computers and feature in the roadmaps of most QC startups. As discussed in Part I, error correction is still ‘expensive’ and difficult to scale, but scientists have developed error mitigation techniques that enhance computational capability without requiring additional qubit resources.
- Fully error-corrected gate-based computers — these gate-based devices will be capable of correcting errors within the timeframe of a calculation, allowing for reliable, near error-free operations and much greater logic depth. With error-correcting mechanisms, gate models will deliver exponentially more computational power than classical machines, run any type of algorithm, and ultimately become universal with a large variety of applications.
At present, most functional prototypes fall in the NISQ category. Early-stage companies building QCs usually target NISQ hardware as the first commercial product on a 3–5 year timeframe.
Quantum annealers have been successfully commercialised but are still mostly used in experiments and research
Although practitioners still disagree on whether this technology should be referred to as QC, annealing seems to have the most potential to find real-life applications in the short term (the next 5 years). Indeed, QAs are well suited to solving optimisation problems and have already been deployed commercially in certain industries. Several companies are working with annealing technologies, but perhaps the most significant player in the field is D-Wave Systems, one of the pioneers in the commercial use of QC. Since its inception in 1999, D-Wave has received over US$200m in funding and has produced four generations of QAs (Gen 4 reportedly sold for $15m), with a client list including Google, Temporal Defence Systems and Lockheed Martin. On its website, D-Wave lists several applications of QA technology.
There is no conclusive evidence yet that QAs are superior to classical computers, and commercial use remains mostly experimental in nature. However, with growing experience and hardware advancement, there are already promising signs that annealing is delivering some benefits over current technologies — for example, see Quantum annealing versus classical machine learning applied to a simplified computational biology problem.
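The annealing process described above can be sketched with its classical cousin, simulated annealing. The toy below encodes an invented three-spin Ising problem as an energy function and cools the system towards its ground state; a real QA does this with quantum hardware rather than random spin flips, so this is an analogy, not an implementation.

```python
import math
import random

# Toy Ising energy: E(s) = sum over pairs of J[i][j] * s[i] * s[j], s[i] in {-1, +1}.
# An annealer encodes its problem in a similar energy function and relaxes
# the system towards the ground (lowest-energy) state. Couplings are invented.
J = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): -1.0}

def energy(s):
    return sum(j * s[a] * s[b] for (a, b), j in J.items())

def anneal(n_spins=3, steps=5000, t_start=2.0, t_end=0.01):
    random.seed(0)
    s = [random.choice([-1, 1]) for _ in range(n_spins)]
    for k in range(steps):
        t = t_start * (t_end / t_start) ** (k / steps)  # geometric cooling schedule
        i = random.randrange(n_spins)
        old = energy(s)
        s[i] = -s[i]  # propose a spin flip
        # Metropolis rule: always accept downhill moves, sometimes accept uphill ones
        if energy(s) > old and random.random() > math.exp((old - energy(s)) / t):
            s[i] = -s[i]  # reject the uphill move
    return s, energy(s)

state, e = anneal()
print(state, e)  # a low-energy spin configuration and its energy
```

For this three-spin problem the ground-state energy is -3; the slow cooling schedule is what lets the system settle there instead of getting trapped.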
A universal quantum machine will deliver exponential improvement in capabilities
Universal QCs will not offer a mere improvement in computing capabilities but a new paradigm, enabling us to tackle challenges thought to be unsolvable with classical computers. There are three broad classes of problems at which QCs should, theoretically, excel:
- Factoring (cryptography) — cybersecurity systems and protocols rely on encryption for the secure communication of information (including password protection, secure web browsing, etc.). A large proportion of encryption techniques, including RSA (widely used in digital signatures) and cryptographic key exchange protocols (e.g. the Transport Layer Security, or TLS, handshake), are based on the hardness of the discrete logarithm problem or of factoring large integers. These schemes have been considered impractical to break because of the time and computing power it would take a classical machine to do so (the time required grows exponentially with the number of bits used in the encryption). However, the properties of QC significantly challenge these assumptions. In fact, the first quantum algorithm to demonstrate the practical potential of QC (proposed by Peter Shor) solved the problem of factoring large numbers and computing discrete logarithms, and did so exponentially faster than any algorithm developed for a classical device. For example, it would take the best-in-class modern classical machine 300 trillion years to break an RSA key with 2048 bits, while a QC with 4099 logical qubits (i.e. ones that could be used in calculations) could do so in around 10 seconds. Admittedly, we are some time away from engineering a machine with 4099 logical qubits (this would require many thousands of additional physical qubits, given the state-of-the-art error correction techniques available today). However, it is clear that QC poses a significant threat to the existing security infrastructure, and this has put many in the cyber community on high alert to develop quantum-secure encryption algorithms.
Commercial examples: The US National Institute of Standards and Technology (NIST) has an ongoing competition to build quantum-resistant encryption algorithms. Companies such as Post-Quantum and PQShield are participants in NIST’s programme.
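To see why factoring is considered classically impractical, here is a minimal sketch using invented toy moduli: trial division performs roughly 2^(bits/2) divisions, so the work grows exponentially with key size (going from 14 to 27 bits below multiplies the division count by roughly 100). Shor's algorithm sidesteps this scaling entirely.

```python
def factor_count(n):
    """Factor an odd semiprime by trial division, counting divisions.
    The work scales as ~sqrt(n), i.e. ~2^(bits/2): exponential in bit-length."""
    f, divisions = 3, 0
    while f * f <= n:
        divisions += 1
        if n % f == 0:
            return f, n // f, divisions
        f += 2
    return n, 1, divisions

# Invented toy 'RSA-style' moduli (each a product of two known primes).
for n in (101 * 103, 10007 * 10009):
    p, q, d = factor_count(n)
    print(f"{n.bit_length()}-bit modulus {n} = {p} * {q} after {d:,} divisions")
```

Real RSA moduli are 2048 bits or more, which is why the same approach would take a classical machine astronomically long.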
- Optimisation (including machine learning) — optimisation problems exist across a range of domains (transportation, finance, energy, logistics, engineering, computer science, etc.). The goal is to find the best solution from a set of possible solutions, usually by minimising the error between a candidate solution and the ideal one. Classical computers typically need to optimise individual variables of the system sequentially before they can arrive at an answer; QCs can operate on multiple variables simultaneously, potentially speeding up the calculation exponentially. QAs are particularly well suited to solving optimisation problems and have already been deployed commercially in industries such as finance, transport and supply chain. They can also be used to meaningfully improve machine learning capabilities.
Commercial examples: JPMorgan Chase wants to optimise stock trading strategies and financial risk analysis using QC. Delta Airlines is trying to use QC to optimise its flight schedule after massive disruptions like hurricanes and blizzards.
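To make "optimisation" concrete, here is a toy, invented portfolio-selection problem of the kind annealers target: a classical brute-force solver must score all 2^n candidate subsets, which is the exponential search space quantum optimisation approaches aim to explore more efficiently. All numbers below are made up for illustration.

```python
from itertools import product

# Invented toy portfolio problem: pick a subset of assets that maximises
# expected return minus risk penalties for pairs held together.
returns = [0.08, 0.12, 0.07, 0.10]                  # expected return per asset
risk = {(0, 1): 0.05, (1, 3): 0.09, (2, 3): 0.02}   # pairwise risk penalties

def score(pick):
    r = sum(returns[i] for i, x in enumerate(pick) if x)
    penalty = sum(p for (i, j), p in risk.items() if pick[i] and pick[j])
    return r - penalty

# Classical brute force examines all 2^n subsets: exponential in n.
best = max(product([0, 1], repeat=len(returns)), key=score)
print(best, round(score(best), 4))  # (1, 0, 1, 1) 0.23
```

With 4 assets there are 16 subsets; with 100 assets there are more subsets than atoms in the observable universe, which is the wall classical brute force hits.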
- Simulation — simulating quantum systems is probably the most natural problem set for a QC. The general purpose here is to simulate the environment in which a system exists and its interactions with that environment. There are a variety of valuable applications in this area, ranging from material science to medicine. Via simulations of chemical reactions, scientists are hoping to revolutionise the way new materials are created. Some of these simulations may require QCs with only a few hundred qubits and therefore could become one of the first real-life applications for gate model computers.
Commercial examples: D-Wave has already employed their QA hardware in a number of scenarios, proving that it is capable of doing simulations of electronic structures (see the table above). ProteinQure has partnered with QC manufacturers (IBM, Rigetti) and pharma companies (AstraZeneca) to explore approaches to modelling proteins.
Chasing Quantum Advantage
A highly sought-after milestone in the race to develop the first QC is achieving “quantum supremacy,” i.e. demonstrating that a QC can perform a task — any task — that is beyond the capability of any classical computer. This often gets mixed up with achieving ‘quantum advantage’, which has a tighter definition: a QC outperforming a classical one on a practically useful problem. In October 2019, Google published a paper claiming it had achieved quantum supremacy — the company used its 53-qubit quantum processor to perform a random sampling calculation (verifying that a set of numbers is randomly distributed) in 200 seconds, a speed it argued was unattainable for a classical computer. The problem Google chose is, arguably, of little practical significance. Google claimed it would have taken IBM’s Summit, currently the most powerful supercomputer, 10,000 years to arrive at the solution; IBM countered that, using a specifically designed algorithm, Summit could solve the problem in 2.5 days. Technicalities aside, while it cannot be classed as ‘quantum advantage’, Google’s achievement is definitely a step in the right direction.
That said, we should not lose sight of the big picture — we are still in the era of NISQ machines that are error-prone, low in computing power and difficult to control. Using a QC to solve the hard problems that will meaningfully impact science, engineering and business is still years away. Using an aviation analogy, Google’s milestone is comparable to the Wright brothers’ first flight in 1903. The first commercial flight did not come until 1910.
Cloud is emerging as the preferred delivery mechanism
While QC technology is relatively nascent, the potential for unlocking powerful use cases is enticing enterprises and academics alike.
D-Wave was one of the first companies to sell QCs commercially. In 2010, Lockheed Martin became the first customer, purchasing a 128-qubit D-Wave One machine (installed at the University of Southern California’s Information Sciences Institute). Following that milestone, other organisations (including Google and NASA) have also purchased D-Wave’s hardware or access to it. While there is still no conclusive evidence that annealers are superior to classical machines, D-Wave helps its customers design algorithms to achieve cost, speed and quality-of-answer advantages for specific problems. This approach has certainly resonated with its client base, which has now signed contracts with a total value exceeding $75m. Recently, D-Wave launched its newest machine, featuring 5,000 qubits.
D-Wave started off by selling hardware for on-premise installation; however, the company is increasingly moving to a cloud model, i.e. providing access to machines instead of ownership. The cheapest tier of its new Leap 2 offering provides 10 minutes of access to a quantum processor for $335 (additional time is charged at $2,000/hr).
Among the major tech providers, Alibaba, Amazon, IBM, Honeywell and Microsoft have introduced cloud access and application development environments on top of their hardware. Despite being among the QC pioneers, Google has yet to provide cloud access to its quantum device. Rigetti Computing is another prominent QC vendor that offers access over the cloud.
Cloud access to compute has well-understood benefits for the consumer: reduced capex, lower maintenance costs, greater scalability and flexibility. These are even more important for QC due to the highly sophisticated hardware involved and the specialised environment required. For the operators, cloud delivery allows more control over the hardware and greater flexibility when it comes to upgrading, introducing new software, languages and applications. Most startups also plan to use the cloud as the default access mechanism and commercial model. That said, the convenience and lower cost of cloud access have to be balanced with security considerations. Given the strategic importance of QC, it is likely that some companies would prefer an on-premise installation. But, at this point, it is difficult to see QCs being mass produced for home or professional environments.
The future is quantum… start now
Theorists in QC expect future machines to represent a step change in electronics, not too dissimilar to the progress made possible by the invention of the classical computer. Given the engineering challenges scientists face, a mature universal QC is at least 10 years away. Therefore, current efforts naturally focus more on the technology than on thoroughly exploring all the possible applications. Nevertheless, experts envisage a revolution in our understanding of important science and engineering fields such as machine learning, materials, chemistry, medicine, meteorology, cyber security and code breaking.
Governments and companies should prepare for the potential paradigm shift because the implications will be profound. There are at least four axes for leaders to consider:
- Quantum-proof security — a mature QC capability might render the present encryption techniques obsolete. Shifting to a quantum-secure architecture, by relying on technologies that harness quantum mechanics, is already becoming a priority for security leaders. The time required to acquire the new expertise and tooling should not be underestimated — the latest report on cyber security risks by the World Economic Forum further emphasises this point.
- Impact of new use cases and applications — as with any new enabling technology, the ‘killer app’ is difficult to imagine today. However, most of the theoretical research points to significant disruption across multiple industries. For example, advances in drug design, faster drug manufacturing, the invention of new materials, or the application of more powerful AI are all types of use cases that can materially shift industry structures. This explains why most governments think of the development of quantum technologies as a matter of strategic importance and a national priority.
- New skills required — in many industries, maintaining a competitive edge will require harnessing the power of quantum systems. Organisations that want to be successful in navigating the hype and identifying business-relevant opportunities will need to develop QC competencies. An interdisciplinary skillset will be important in the early days both for assessing the business impact and selecting the relevant technology stack. Fortunately, with many cloud offerings up and running (see above), some of the tools needed to start experimenting are already present.
- Regulation and responsible design — QC is a technology with significant disruptive potential. Therefore, who gets to develop, use and benefit from it matters. We are still in the early days of development and this presents an opportunity to embed ethical and responsible design principles in our approach to the technology.
QCs with 50–70 logical qubits would be only marginally superior to the most powerful classical supercomputers, and only for a relatively limited subset of problems. For example, in material science and chemistry, such quantum machines could simulate only relatively simple structures, while the goal is to examine complex reactions in order to find new, more efficient methods to produce ammonia, or to simulate the reactions occurring in the first moments after the Big Bang. The highly anticipated applications discussed above (e.g. code breaking or drug design) require thousands if not millions of logical qubits, a feat clearly beyond any current engineering capability. Therefore, finding applications that fill the gap between the computers we have today and the universal quantum machines of the future is crucial to keep investment flowing in.
UK is a global leader in quantum technologies
The UK is one of the leaders in quantum technologies, ranking fourth globally by R&D budget.
The UK National Quantum Computing Programme is organised in four hubs, each focussing on a different set of quantum technologies:
- The Sensors and Timing Hub aims to develop a range of quantum sensor and measurement technologies
- The Quantum Imaging Hub works on new types of ultra-high sensitivity cameras such as single-photon visible and infrared cameras, single-pixel cameras, extreme time-resolution imaging, 3D profiling, imaging beyond line-of-sight, and imaging of local gravity fields.
- The Quantum Computing & Simulation Hub focuses on accelerating progress within QC.
- The Quantum Communications Hub focuses on the research, development and commercialisation of quantum secure communications.
The government provided £120m of funding to these hubs over a five-year period (2014–2019). In 2019, the UK government launched a second wave of support worth £94m. As part of this second phase, a National Quantum Computing Centre was also established. This dedicated national centre has £93m of funding covering four key streams:
- 100+ qubit NISQ demonstrator hardware platform(s)
- Quantum software, algorithm & applications development
- High performance, scalable qubit technology development
- Roadmap and architecture towards fault-tolerant general purpose quantum computing.
The robust academic activity and government support are supporting the emergence of a vibrant early-stage ecosystem.
Our research identified 31 UK startups and scale-ups focusing on quantum technologies:
- Nearly half of those are targeting QC, with eight focusing on hardware and seven on software. Examples include Cambridge Quantum Computing (software), Phasecraft (software), Orca Computing (hardware) and Oxford Quantum Circuits (hardware).
- More than a third are developing new techniques to improve existing communications and introduce new quantum-proof security protocols. Examples include AegiQ, Crypto Quantique, KETS Quantum Security, PQShield.
- The sensors and imaging category seems somewhat under-represented in the early-stage ecosystem. Examples include Artemis Analytical, M Squared, Raycal.
Looking at activity from a funding perspective does not change the dynamics materially. QC startups have attracted a higher share of funding, indicating 1) higher capital intensity and/or 2) more mature businesses (e.g. Cambridge Quantum Computing accounts for a significant share of the software category).
Globally, the share of companies focusing on QC is significantly higher than those targeting other quantum technologies. In the UK, the activity is much more balanced, possibly due to the existence of the four large national hubs (see above) that are driving interest in their respective areas.
Since 2012, investment into UK quantum startups has reached £110m, with £24m delivered in the form of public grants and £86m coming from the private sector, i.e. venture capital (VC). The UK’s share of global private investment ($1.5bn) stands at 7%, broadly consistent with its 6% share of global R&D budgets.
VC funding of UK quantum startups has been volatile. After a peak in 2016, 2017 and 2018 saw a downward trend, followed by a sharp recovery in 2019. 2020 has seen robust funding activity so far with investment over the first nine months almost matching last year’s amount.
The vast majority of funding in the last couple of years has gone into startups developing QC hardware and software. This is in line with the global trend and underlines an increased investor appetite to gain exposure to emerging technology. Equally, the quantum of funding raised by some of the mature hardware projects highlights the high capital requirements related to supporting R&D efforts in the space. More than half of the total global private investment in quantum technologies has gone into just four companies: PsiQuantum, D-Wave, Rigetti, and IonQ. All of these focus on developing QC hardware.
We expect the upward trend in investment to continue due to 1) the technology maturing, 2) wider availability of public, non-dilutive grants, and 3) news flow related to important milestones as hype intensifies (e.g. Google’s announcement of achieving quantum supremacy). Indeed, Google Trends data shows that QC achieved peak global popularity when Google announced it had achieved quantum supremacy.
As QC continues to evolve from theoretical to practical applications, investor interest will only grow. However, for the time being, QC remains a very specialised, possibly niche pursuit for deep tech investors who possess the necessary long-term horizon, ability to invest across cycles, and above-average risk tolerance. Below, we outline some of the key aspects to consider when approaching opportunities in the space (this is, by no means, exhaustive).
Hardware projects tend to be very capital intensive
It is no surprise that investors are lured by the opportunities offered by QC — the first company to develop a truly scalable configuration with effective error correction mechanisms will generate an outsized return. However, investing in hardware comes with a few important caveats:
- Firstly, the most effective approach to scaling a QC while keeping error rates manageable has not yet been established. Therefore, any hardware project comes with significant scientific, engineering and execution risks. One of the most fundamental choices QC hardware investors face concerns qubit technology. The majority of global VC funding has gone towards photonic and superconducting qubits, according to data compiled by Le Lab Quantique. Yet a survey of 22 QC experts conducted by the Global Risk Institute and evolutionQ Inc. pointed to superconducting and trapped-ion qubits as having the highest potential to realise a 100-qubit machine within the next 15 years. Photonic qubits ranked fourth, only ahead of topological qubits (still largely theoretical). This mismatch suggests that a significant amount of investment may be flowing into relatively speculative approaches.
- Second, most startups target building a NISQ machine as an intermediate milestone to developing a universal computer. However, NISQ machines are likely to be of limited commercial value. Thus, a project is likely to be non-revenue generating for the first 5 years and require continuous cash injections (D-Wave has raised more than $200m to launch its commercial offering). Such capital intensity makes a QC hardware investment akin to backing a biotech business.
- Third, benchmarking between projects is not easy, especially given the number of different approaches that are religiously defended by academics. It is tempting to look at qubit counts as a heuristic for progress made or the likelihood of success. When evaluating quantum hardware opportunities, beware of claims based on high qubit numbers — without error correction, even a machine with many physical qubits may have few ‘usable’ qubits due to the redundancy required.
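The redundancy caveat can be made concrete with back-of-the-envelope arithmetic. The 1,000:1 overhead below is an illustrative assumption, not a measured figure; published surface-code estimates range from hundreds to thousands of physical qubits per logical qubit, depending on physical error rates.

```python
# Back-of-the-envelope: physical qubits needed for a target logical count.
# The 1,000:1 overhead is an ILLUSTRATIVE assumption; real error-correction
# overheads depend on physical error rates and the target logical error rate.
logical_needed = 4099          # logical qubits often cited for breaking RSA-2048
physical_per_logical = 1000    # assumed error-correction overhead

physical_needed = logical_needed * physical_per_logical
print(f"{physical_needed:,} physical qubits")       # ~4.1 million

# Compare with today's NISQ devices (tens of physical qubits):
print(f"gap vs a 53-qubit device: {physical_needed // 53:,}x")
```

Under these assumptions, a headline qubit count overstates usable capacity by three orders of magnitude, which is why raw qubit numbers are a poor benchmark.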
Software is challenging to commercialise on its own
Some of these risks exist in any hardware investment, not just QC. Therefore, software might seem like a more appealing way to gain exposure to the space. Indeed, VCs are comfortable assessing software companies and usually prefer them for their lower operational complexity, lower capital intensity and faster route to market. However, QC software startups come with an important caveat — the hardware needed to run the software is not yet fully developed. Hence, abstracting away the complexities of a particular hardware configuration while developing the software is not always viable. This might require a software company to form a close partnership with a hardware manufacturer and give up some of the economics. In other words, without enough abstraction between hardware and software, the benefits of investing in a software product are not clear. In fact, it might entail greater risk and take longer to commercialise.
One exception here is companies that provide software to simulate a QC using a classical machine. Such simulators are useful when developing or testing quantum algorithms and should be easier to commercialise, although open-source solutions already exist.
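For a flavour of what such simulators do, here is a minimal statevector sketch (a toy illustration, not any vendor's product): it stores all 2^n complex amplitudes of an n-qubit register and applies a Hadamard gate. The same bookkeeping also shows why classical simulation hits a wall: memory grows as 2^n with the qubit count.

```python
import math

def apply_hadamard(state, target, n_qubits):
    """Apply a Hadamard gate to qubit `target` of an n-qubit statevector."""
    h = 1 / math.sqrt(2)
    new = state[:]
    for i in range(2 ** n_qubits):
        if i & (1 << target) == 0:       # pair basis states differing in `target`
            j = i | (1 << target)
            new[i] = h * (state[i] + state[j])
            new[j] = h * (state[i] - state[j])
    return new

# Two-qubit register initialised to |00>: amplitudes [1, 0, 0, 0].
state = [1.0, 0.0, 0.0, 0.0]
state = apply_hadamard(state, 0, 2)      # puts qubit 0 into superposition
probs = [round(abs(a) ** 2, 3) for a in state]
print(probs)  # measurement probabilities: [0.5, 0.5, 0.0, 0.0]

# The catch: n qubits need 2**n amplitudes.
print(f"50 qubits would need {2**50:,} amplitudes")
```

This is also why such simulators are mainly useful for developing and testing algorithms on small qubit counts rather than for outperforming real hardware.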
Teams balancing academic excellence with commercial drive are better positioned
Despite the progress made over the past decades and the growing number of private companies getting involved, QC continues to require intense academic research. The overwhelming majority of early-stage businesses in the space are university spin-outs. While academic credentials are essential in the early stages of a project, the successful commercialisation of the technology will likely require a pragmatic approach with quick iteration. Therefore, founding teams that combine academic heritage and commercial experience are better positioned in the long run.
Quantum computing offers a new paradigm in information technologies, promising to be no less disruptive than the invention of the computer itself. A universal quantum computer may still be years away but the technology has started its transition from academia to commercial applications. As this transition gathers pace, investor and entrepreneurial interest will only grow. We are excited to be part of the journey.
We’d like to thank Andrii Iamshanov (AegiQ Advisor), Christophe Jurczak (Founder and Partner at Quantonation), Ilana Wisby (CEO of Oxford Quantum Circuits), Max Sich (CEO of AegiQ), Namratha Kothapalli (Senior Associate at Speedinvest) and Richard Murray (CEO of Orca Computing), whose feedback and guidance contributed to the above work.