Quantum-aided drug discovery

Quantum Computing · Insight · 10 minute read · 05/11/25
Richard Polifka

Manager Technology & Data, PwC Switzerland

Discovery of new chemical substances and drugs is known to be a long and costly process, with average costs estimated at up to around $2.5 billion. In recent years, computer-aided drug discovery has made the initial phase more efficient by designing chemical compounds in the digital world before chemists attempt to synthesise them in the lab. However, as the complexity of the substances grows, so do the requirements on the computing infrastructure. Quantum computing bears the promise of performing certain types of tasks significantly more efficiently than classical computing.

In the following article, researchers from PwC Switzerland, a Swiss quantum hub and academic institutions investigate whether the simulation of new drugs might be one such task.

“Quantum computing is paving the way for more efficient and creative drug discovery, outperforming classical methods in generating novel compounds.”

Prafull Sharma
Partner, Technology & Data, PwC Switzerland

The classical approach to computational drug discovery in this analysis uses Generative Adversarial Networks (GANs) combined with Variational Autoencoders (VAEs). The autoencoder encodes the chemical compounds into abstract vectors of real numbers (called latent vectors) that preserve their complexity in a reduced space.

In the GAN model, two neural networks compete: a Generator tries to produce new data that looks real, while a Discriminator evaluates whether a given piece of data is real (coming from the training set) or fake (made up by the Generator).

Since the two networks have opposing objectives, the training follows an adversarial (min-max) procedure which depends on a “loss function” – a metric which defines whether it is more important to create fakes indistinguishable from reality or whether certain deviations are allowed. The latter is the case for the drug discovery task, since the main purpose is to generate new chemical compounds. Once the training is done, the Generator produces sets of new chemicals for which typical characteristics are estimated. An example of such a metric is how many viable new compounds are generated in comparison to the original training dataset.
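
To make the min-max procedure concrete, the following is a minimal sketch of one adversarial training step, assuming PyTorch; the layer widths, learning rates and batch size are illustrative placeholders, not the settings used in the study.

```python
# Minimal sketch of one adversarial (min-max) training step in PyTorch.
# All sizes and hyperparameters are illustrative, not the study's settings.
import torch
import torch.nn as nn

latent_dim, noise_dim = 10, 8                      # assumed dimensions
G = nn.Sequential(nn.Linear(noise_dim, 64), nn.ReLU(), nn.Linear(64, latent_dim))
D = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, 1))

bce = nn.BCEWithLogitsLoss()                       # the "loss function" of the min-max game
opt_g = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-4)

real = torch.randn(32, latent_dim)                 # stand-in for VAE-encoded compounds
noise = torch.randn(32, noise_dim)

# Discriminator step: label real data 1, generated data 0.
fake = G(noise).detach()
loss_d = bce(D(real), torch.ones(32, 1)) + bce(D(fake), torch.zeros(32, 1))
opt_d.zero_grad()
loss_d.backward()
opt_d.step()

# Generator step: try to make the Discriminator call fakes real.
loss_g = bce(D(G(noise)), torch.ones(32, 1))
opt_g.zero_grad()
loss_g.backward()
opt_g.step()
```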

Quantum Computing is an emerging technology that operates with qubits instead of classical bits, obeying the laws of Quantum Mechanics. Unlike bits, which can be either zero or one, qubits can be in a state which is a combination of both. Exploiting the effects of superposition, interference and entanglement is what predestines Quantum Computing to be more efficient at certain classes of tasks. Generating systems with intrinsic dependencies is one such task, and it is exactly what happens at the heart of the GAN Generator. In this work, the Generator has been replaced by a series of quantum circuits (see Figure 1) in a scheme called the Style-Based Quantum GAN, and the inference of the pipeline has been tested on an IBM Heron Quantum Computer.

Our analysis

The analysis was performed in three steps. In the first, the VAE was trained on 12’000 chemical compounds and its internal learning parameters were optimised. The goal was to create a pipeline able to encode complex chemical structures from the dataset into a latent space of dimension 10 and then decode them to recover the initial set of molecules. Once an optimised VAE training was obtained, it was used in the GAN scheme.
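
As an illustration of this first step, here is a minimal VAE sketch with a 10-dimensional latent space, assuming PyTorch; the input featurisation (input_dim) and layer widths are placeholders, since the study’s molecular representation is not detailed here.

```python
# Minimal VAE sketch with a 10-dimensional latent space (PyTorch).
# input_dim is a placeholder for however the compounds are vectorised;
# the study's actual representation may differ.
import torch
import torch.nn as nn

input_dim, latent_dim = 256, 10

class VAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(input_dim, 128), nn.ReLU())
        self.mu = nn.Linear(128, latent_dim)       # mean of the latent Gaussian
        self.logvar = nn.Linear(128, latent_dim)   # log-variance of the latent Gaussian
        self.dec = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                 nn.Linear(128, input_dim))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterisation
        return self.dec(z), mu, logvar

vae = VAE()
x = torch.randn(4, input_dim)                      # stand-in batch of featurised compounds
recon, mu, logvar = vae(x)
# Training minimises reconstruction error plus a KL term pulling the
# latent distribution towards a standard normal:
loss = nn.functional.mse_loss(recon, x) - 0.5 * torch.mean(
    1 + logvar - mu.pow(2) - logvar.exp())
```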


Figure 1 - The Style-Based Latent Quantum GAN scheme

The training of the GAN and the optimisation of its learning parameters were performed in the classical setting, where the Generator consisted of a Neural Network (NN) with 5 fully connected layers and 705’162 internal trainable parameters. The outputs of the Generator (synthetic data) and of the VAE Encoder (training data), in the form of latent vectors, were processed by the Discriminator (a 3-layer NN), which tried to distinguish the original from the generated data.
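
For orientation, the snippet below shows how such parameter counts arise for a 5-layer fully connected generator; the layer widths are guesses for illustration and do not reproduce the 705’162 parameters of the actual model.

```python
# Illustrative 5-layer fully connected generator; the widths are guesses and
# do not match the study's 705'162 parameters, only how such counts arise.
import torch.nn as nn

G = nn.Sequential(
    nn.Linear(10, 256), nn.ReLU(),
    nn.Linear(256, 512), nn.ReLU(),
    nn.Linear(512, 512), nn.ReLU(),
    nn.Linear(512, 256), nn.ReLU(),
    nn.Linear(256, 10),                            # output: 10-dim latent vector
)
n_params = sum(p.numel() for p in G.parameters())  # weights + biases per layer
print(f"trainable parameters: {n_params}")
```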

Once the learning parameters of the GAN were estimated, the classical Generator was replaced by quantum circuits and trained on a noiseless quantum simulator. The circuits implemented two layers of gate operations on 5 or 10 qubits. In the final step, a set of molecules was generated with the Quantum Generator running on a real Noisy Intermediate-Scale Quantum (NISQ) computer. For the final comparison, 2’500 generated molecules were used. The resulting molecular sets were analysed with a dedicated tool for computational chemistry to evaluate chemical validity and compatibility with drug design.
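
The following is a toy stand-in for such a quantum Generator, assuming PennyLane: a generic two-layer variational ansatz on 5 qubits, not the exact style-based circuit of Figure 1. It also shows how few trainable parameters such circuits need.

```python
# Two-layer parameterised quantum circuit on 5 qubits (PennyLane) as a toy
# stand-in for the quantum Generator; not the exact style-based ansatz.
import pennylane as qml
from pennylane import numpy as np

n_qubits, n_layers = 5, 2
dev = qml.device("default.qubit", wires=n_qubits)  # noiseless simulator

@qml.qnode(dev)
def generator(weights):
    # Alternating single-qubit rotations and entangling gates.
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

shape = qml.StronglyEntanglingLayers.shape(n_layers=n_layers, n_wires=n_qubits)
weights = np.random.uniform(0, 2 * np.pi, size=shape, requires_grad=True)
sample = generator(weights)                        # 5 expectation values in [-1, 1]
print(sample, np.prod(shape))                      # only 2*5*3 = 30 trainable angles here
```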

The results


Comparison of the distribution of the internal diversity of the molecular sets between the training set and four sets generated with the VAE under different training settings.


Comparison of the distribution of the likelihood that a molecule is a viable drug candidate between the test and training samples and various QGANs implementing different circuits.


Example molecules generated by the IBM Heron Quantum Computer and decoded with the VAE.

The images above present the results. The first plot shows that the trained VAE reproduces the training set very well, meaning that the VAE Decoder can be used to reconstruct the novel molecules generated by the GANs. The middle plot compares the ability of the classical and quantum GANs to reproduce the training and test distributions. While the classical GAN tends to reproduce the shape of the training sample, the quantum GANs show similarity to the test sample – a sample they were not trained on – meaning that they generalise better than the classical GAN. The last image shows an example subset of molecules generated by the IBM Heron Computer. Three sets of 2’500 molecules were generated with the classical, noise-free quantum and real quantum Generators. The results show comparable values in several metrics, which on its own is a success for a NISQ computer. Additionally, the Quantum Generator has a significantly simpler inner structure (~6’400 times fewer internal trainable parameters), dramatically reducing the complexity of the model.
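
Metrics of this kind (validity, drug-likeness, internal diversity) can be computed with open-source chemistry tooling. The sketch below assumes RDKit, with QED as a stand-in drug-likeness score; the article does not name its actual evaluation tool, so treat this purely as an illustration.

```python
# Sketch of post-generation evaluation with RDKit (assumed tooling). QED is a
# stand-in drug-likeness score; internal diversity is 1 minus the mean
# pairwise Tanimoto similarity of Morgan fingerprints.
from itertools import combinations
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem, QED

smiles = ["CCO", "c1ccccc1O", "CC(=O)Nc1ccc(O)cc1", "not_a_molecule"]
mols = [Chem.MolFromSmiles(s) for s in smiles]
valid = [m for m in mols if m is not None]         # invalid SMILES parse to None

validity = len(valid) / len(smiles)
qed_scores = [QED.qed(m) for m in valid]           # drug-likeness in [0, 1]

fps = [AllChem.GetMorganFingerprintAsBitVect(m, 2, nBits=2048) for m in valid]
sims = [DataStructs.TanimotoSimilarity(a, b) for a, b in combinations(fps, 2)]
internal_diversity = 1 - sum(sims) / len(sims)

print(f"validity={validity:.2f}, mean QED={sum(qed_scores)/len(qed_scores):.2f}, "
      f"IntDiv={internal_diversity:.2f}")
```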

First step to quantum-aided drug discovery

In conclusion, this work presents a first step on the road to quantum-aided drug discovery, showing its potential already at this initial small scale. These promising preliminary results were presented at the first International Conference on Applied Quantum Methods in Computational Science and Engineering (AQMCSE) in Aachen on October 8th, 2025 [1].

FAQ about Quantum Computing

At the beginning of the 1980s, Richard Feynman, an American theoretical physicist, presented the possible advantages of computing based on the laws of quantum mechanics in his famous lectures. To quote him: “Nature isn’t classical, and if you want to make a simulation of nature, you’d better make it quantum mechanical, and it’s a wonderful problem, because it doesn’t look so easy.” Indeed, the problem turned out to be extraordinarily challenging.  

With Peter Shor’s development in 1994 of an algorithm that can efficiently find the factors of (very) large numbers, interest in QC rose. Shor’s algorithm, once performed on sufficiently advanced hardware, promises to break widely used cryptographic schemes based on prime factorisation and thus endanger many of our activities in daily life (such as banking).

It took more than 20 years before a team of scientists at Google could claim ‘quantum supremacy’ for their quantum computer in 2019, performing certain operations in seconds while the same calculations would take a classical supercomputer 10,000 years. As the race to build such a machine involves fierce competition, the supremacy claim was immediately challenged by other players in the field. Nevertheless, it showed that commercially produced quantum computers are far beyond their conceptual phase, and that the once unreachable fault-tolerant universal quantum computer capable of breaking RSA encryption might not be so far in the future.

QC is built around the concept of a qubit – a quantum analogue of a classical computational bit. While a classical bit can have only one of two values at any time – namely 0 or 1 – the qubit, obeying the laws of quantum mechanics, can benefit from the effect of superposition – a situation where it is a mixture of both the 0 and 1 states. Superposition, together with entanglement and interference, fundamentally changes the way calculation is performed, leading to exponential speed-ups for certain problems. Similar to classical computation, the desired result is obtained by developing and executing algorithms. In the quantum case, this involves defining the initial state of the system and performing operations (applying ‘gates’) to bring the system into the desired configuration, in which it can be measured. The key performance indicators of such a process are the number of qubits, the time a system can stay in a given configuration (called coherence) and the quality of the gates (called gate fidelity).
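
A tiny sketch, assuming PennyLane, makes this concrete: a Hadamard gate creates superposition, a CNOT entangles two qubits, and measurement returns probabilities over the possible outcomes.

```python
# Tiny PennyLane sketch: a Hadamard puts one qubit into superposition and a
# CNOT entangles it with another; measuring yields Bell-state statistics.
import pennylane as qml

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def bell():
    qml.Hadamard(wires=0)          # superposition: (|0> + |1>)/sqrt(2)
    qml.CNOT(wires=[0, 1])         # entanglement: (|00> + |11>)/sqrt(2)
    return qml.probs(wires=[0, 1])

print(bell())                      # ~[0.5, 0.0, 0.0, 0.5] over |00>,|01>,|10>,|11>
```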

One of the specifics of the quantum industry is the fact that potentially many roads lead to the same destination. This fact is stimulating healthy competition between different conceptual approaches and quantum technologies. Here are the most advanced ones:

  • Superconducting QC. The qubits are realised by electrical circuits cooled down to almost absolute zero (below 0.015 kelvin), where currents flow with almost no resistance. Such a configuration exhibits quantum behaviour, and the qubits are manipulated using electromagnetic pulses in the microwave spectrum. The advantages of this technology are its overall maturity and the speed of its gates.
  • Trapped ions. Qubits consist of ions (atoms with one or more electrons missing from their orbitals) that are electrically charged and can therefore be trapped and manipulated by electromagnetic fields. The advantages are the high connectivity of individual qubits and long coherence times.
  • Photonics. The qubits are photons themselves, with beam splitters acting as gates. The advantage of this technology is that it can operate at room temperature.
  • Neutral atoms. The qubits are atoms suspended in an ultra-high vacuum and manipulated by tightly focused laser beams called optical tweezers. Thanks to the neutral nature of the qubits, this technology is less sensitive to stray electric fields that could disturb the system.
  • Topological QC. Topological qubits are based on quasiparticles: the building blocks are not individual particles but collective characteristics of carefully fabricated quantum systems. They are realised with semiconductor nanowires hosting so-called “Majorana zero modes” – a state of matter expected to make the qubits significantly more resistant to noise. Recently, such a quantum chip was introduced, together with a roadmap towards commercially viable computers with millions of qubits.

All the technologies described above are still relatively young and cannot yet deliver a sufficiently high number of logical qubits to allow industry applications to demonstrate commercial quantum advantage. This would be achieved when a quantum computer manages to solve a real-life business problem that currently can’t be solved with classical computers. One example could be large-scale real-time portfolio optimisation or a travelling salesman problem (TSP) with thousands of cities to visit. Progress is partially slowed down by the fact that, as quantum systems grow (in the number of qubits), it becomes increasingly difficult to keep them in the desired state. Even a small amount of noise can negatively affect the performance of the algorithms. A highly active field of study is quantum error correction, where several ways to mitigate noise are being implemented. It means that more than one physical qubit must be used to create an algorithmic or logical qubit – the building block of a functioning algorithm. The implication is a large overhead of physical qubits that need to be built, for the most complex systems up to a factor of 1,000, labelling this stage of QC development the noisy intermediate-scale quantum (NISQ) era. This indicates that a fault-tolerant quantum computer (FTQC) is still at least a decade away. Even though the FTQC isn’t there yet, there’s a lot of research in the QC application domain demonstrating that even NISQ quantum computers can potentially bring some benefits.
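
As a loose illustration of why this redundancy helps – a classical analogy only, not genuine quantum error correction – consider encoding one logical bit into three noisy physical bits and majority-voting:

```python
# Toy classical analogy of redundancy (not real quantum error correction):
# storing one logical bit in three physical bits and majority-voting pushes
# the error rate from p down to ~3p^2 for small p.
import random

def noisy(bit, p):
    return bit ^ (random.random() < p)             # flip with probability p

def logical_readout(bit, p, copies=3):
    votes = [noisy(bit, p) for _ in range(copies)]
    return int(sum(votes) > copies // 2)           # majority vote

p, trials = 0.05, 100_000
errors = sum(logical_readout(0, p) for _ in range(trials))
print(f"physical error rate: {p}, logical error rate: {errors / trials:.4f}")
# Expected logical rate ~ 3*p^2*(1-p) + p^3 ≈ 0.0073 < 0.05
```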

Quantum annealers are a special class of their own. While the technologies above provide ‘digital’ QC, quantum annealers are ‘analogue’ quantum computers. They’re based on superconducting technology and exploit the adiabatic theorem, which states that a quantum system stays in its state if a change acting upon the system is sufficiently slow and the energy needed to cause a transition into an excited state is sufficiently large. In such a case, the system can be prepared in a well-defined state and allowed to evolve into a final state which characterises the problem to be solved.
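
Concretely, annealers are typically used to minimise a quadratic unconstrained binary optimisation (QUBO) energy. The toy sketch below, with a made-up cost matrix, brute-forces the same minimisation that the annealer performs physically.

```python
# What an annealer minimises: a QUBO energy E(x) = x^T Q x over bit-strings x.
# Here a brute-force scan over all 2^n states stands in for the annealing
# schedule; the Q matrix is made up purely for illustration.
from itertools import product
import numpy as np

Q = np.array([[-1.0, 2.0, 0.0],
              [ 0.0, -1.0, 2.0],
              [ 0.0,  0.0, -1.0]])                 # upper-triangular QUBO matrix

best = min(product([0, 1], repeat=3),
           key=lambda x: np.array(x) @ Q @ np.array(x))
print("lowest-energy state:", best)                # the state encoding the answer
```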

[1] J. Baglio (U. Basel & QuantumBasel), Y. Haddad (U. Bern & CERN), and R. Polifka (2025). Latent-Style-Based Generative Quantum Model for Assisted Drug Discovery. Research highlights at the first International Conference on Applied Quantum Methods in Computational Science and Engineering (AQMCSE 2025), Aachen, Germany, October 8th, 2025.
