Drug discovery and development is a long, costly, and high-risk process that typically takes 10 to 15 years, with an average cost of $1 to $2 billion for each new drug approved for clinical use.
For any pharmaceutical company or academic institution, advancing a drug candidate to phase 1 clinical trials after rigorous preclinical optimization is a major achievement. Yet nine out of ten candidates go on to fail during phase 1, 2, or 3 clinical trials or at the approval stage.
It is also worth noting that this 90% failure rate applies only to candidates that have already advanced to phase 1 clinical trials; it does not count candidates that fail in the preclinical stages. If those are included, the failure rate of drug discovery and development is even higher than 90%. The term for this declining efficiency in drug development is Eroom's Law ("Moore" spelled backward): the observation that the number of new drugs approved per billion dollars of R&D spending has roughly halved every nine years.
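To see how the 90% figure compounds from individual stages, consider a back-of-the-envelope calculation. The per-phase success rates below are illustrative assumptions, roughly in line with commonly cited industry figures, not exact statistics:

```python
# Assumed (illustrative) probabilities that a candidate entering each
# stage advances to the next; real figures vary by disease area and era.
phase_success = {
    "phase 1": 0.63,
    "phase 2": 0.31,
    "phase 3": 0.58,
    "regulatory approval": 0.85,
}

# Multiply the stage probabilities to get the end-to-end success rate.
overall = 1.0
for stage, p in phase_success.items():
    overall *= p

print(f"Chance a phase 1 entrant reaches approval: {overall:.1%}")  # about 9.6%
print(f"Implied failure rate: {1 - overall:.1%}")                   # about 90.4%
```

Even with each stage succeeding a third to two-thirds of the time, the product lands near one in ten, which is exactly the "nine out of ten fail" statistic quoted above.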
Why does drug discovery and development fail?
In chapter 21 of Artificial intelligence and machine learning in clinical trial design and application, Robert J. Beetel and fellow authors explained: “There are many explanations given for why drug discovery has followed Eroom’s law, from cautious regulators to increasing overall R&D costs. But one of the biggest areas holding back progress is inefficiency in the drug discovery phase, the preclinical animal testing and clinical trial phase of the drug-discovery process.”
Of the compounds that make up the theoretical chemical space, how many have drug-like properties that could be used to produce life-saving drugs? Millions? Billions? Trillions? The estimated answer is 10^60 (a novemdecillion). To put this into context, the Milky Way contains around 100 billion, or 10^11, stars.
From discovery through phase 3, the typical drug development process has depended on serendipity, experimentation, optimization, animal models, and finally evaluating the new medicine in people. Various technologies are now contending to disrupt this process. This article will attempt to demonstrate why three major technologies currently being developed and tested have the potential to change the course of drug development: artificial intelligence (AI) for drug discovery, organ-on-a-chip (OOC) for testing without animal models, and digital twins for clinical trials.
This is not science fiction, but rather a plausible evolution in drug discovery that might become the norm. Let’s get granular on how these new technologies can disrupt drug discovery.
Drug discovery evolution: the use of generative AI
Drug discovery is an intricate, trial-and-error process. But then, AI intervened, revolutionizing the landscape.
AI, showcased by platforms like DALL-E 2 and Midjourney, has transitioned from generating surreal images to redefining drug development. It is infiltrating every stage of the process, from target identification to compound design and prediction. Nvidia's BioNeMo has emerged as a comprehensive tool, amalgamating diverse AI-driven functions such as 3D protein structure prediction, protein property prediction, small-molecule and protein generation, and molecular docking.
Nevertheless, there are many companies with proprietary platforms and databases working on their own pipelines or partnering with pharma, to mention a few: Insitro, Insilico Medicine, Atomwise, Healx, Verseon, and Qubit (quantum technology for drug discovery).
The revelation of AI’s potential came with a BCG and Wellcome report projecting time and cost savings of about 25–50%. Subsequently, success stories emerged: Exscientia led the way, producing groundbreaking molecules for clinical trials with AI’s aid (an A2A receptor antagonist for solid tumors and a selective serotonin reuptake inhibitor for obsessive-compulsive disorder). Insilico Medicine followed suit, launching a phase 1 trial for an AI-designed senolytic drug targeting aging-related diseases.
And many other examples followed, like Relay Therapeutics with its RLY-4008 for cholangiocarcinoma, Nimbus Therapeutics with NDI-034858 for moderate to severe psoriasis, and Evaxion with EVX-01 for melanoma. In all three cases, the use of AI shortened the time from discovery to clinical trials.
AI’s impact is seismic, blurring the lines between human ingenuity and machine precision. It isn’t merely about molecules but the fusion of human and machine intelligence, charting a new course in pharmaceutical discovery.
AI isn't just disrupting the industry; it is catalyzing an evolution in drug discovery, paving the way for a future where technology and human endeavor coalesce, propelling science toward innovative breakthroughs and healing. Think about compressing a four-year process into a couple of weeks. It's mind-blowing to imagine how far we could get with vertical integration between startups (experts in innovation) and big pharmaceutical companies (experts in clinical trials, regulation, and commercialization) joining forces.
Pre-clinical testing: the use of organ-on-a-chip
For years, the pharmaceutical and biotechnology industries had navigated the arduous path of drug development, investing colossal sums to shepherd a compound from the laboratory bench to the FDA’s approval threshold. Yet, a recurring thorn plagued their journey: the inadequacies of animal testing in predicting human responses to drugs.
Countless promising compounds have navigated preclinical development via studies in small animals or non-human primates only to fail in human trials, either through inefficacy (for example, the tuberculosis vaccine MVA85A, the HIV-1 DNA/rAd5 vaccine, and hepatitis C vaccines) or by inducing potentially life-threatening toxicities (for example, the Hu5c8 monoclonal antibody), forcing the trials to be halted.
Enter the organ-on-a-chip (OOC), a miniature marvel designed to emulate human organ and tissue structures using microfluidic devices. These chips are a technological symphony, nurturing living cells in tiny chambers interconnected by microchannels, mimicking the intricate functions and responses of human organs. Blood flow, inflammation, infection – these chips replicate the natural orchestration of physiological mechanisms, offering a promising alternative to the limitations of animal models.
The significance of this groundbreaking innovation is profound. It isn’t just about mimicking human physiology; it’s about the tangible impact it could wield in the world of medicine. OOCs promise a seismic shift in drug development. They offer higher physiological relevance, reduce reliance on animal testing, produce faster results, and open an avenue for more complex experiments by co-culturing different cell types or organs.
Imagine a liver chip boasting 87% sensitivity and 100% specificity, far surpassing animal models (whose sensitivity is closer to 47%). This isn't just an incremental improvement; it is a quantum leap in drug testing accuracy. The implications are profound and far-reaching.
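For readers unfamiliar with these metrics: sensitivity is the fraction of truly toxic compounds a screen correctly flags, and specificity is the fraction of safe compounds it correctly clears. A minimal sketch, using hypothetical counts chosen only to reproduce the quoted liver-chip figures:

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of truly toxic compounds correctly flagged: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of truly safe compounds correctly cleared: TN / (TN + FP)."""
    return tn / (tn + fp)

# Hypothetical screening panel (not the actual study data): 23 known
# hepatotoxic drugs, of which the chip flags 20, and 10 known safe
# drugs, none of which it falsely flags.
sens = sensitivity(tp=20, fn=3)   # 20 / 23 ≈ 0.87
spec = specificity(tn=10, fp=0)   # 10 / 10 = 1.00

print(f"sensitivity: {sens:.0%}, specificity: {spec:.0%}")
```

A 47%-sensitive animal model would miss more than half of the toxic compounds in the same panel, which is why the gap between 47% and 87% matters so much in practice.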
Lives lost to toxic drug reactions could be saved if these chips were employed systematically in preclinical screening, a path opened by the FDA Modernization Act 2.0 of 2022. The chips hold the potential not just to predict toxicity but also to tailor drug testing to individual variation. Lung, liver, kidney, heart, brain: each can now be replicated on a chip, as a solo organ or in synergy as a body on a chip. This will help predict the safety and efficacy of potential therapeutics, enable personalized treatment regimens, and advance the 3 R’s (Replacement, Reduction, and Refinement) in animal research.
Clinical trials: the use of digital twins
For years, the final stages of drug development had been the battleground for scientific endeavors, the arena where the promise of a new remedy clashed with the harsh realities of uncertainties, variabilities, and repeated failures. The statistics whispered a grim truth – only a fraction, a mere 13.8%, of drug candidates entering phase 1 trials eventually found the coveted approval from regulatory bodies. The financial stakes were exorbitant, with the average cost of conducting a phase 3 trial scaling to a staggering $255 million, contingent upon the diseases and outcomes measured.
Yet, during this storm of uncertainties, emerged the digital twins – ethereal replicas of the physical entities or systems, created from the mosaic of data sourced from an array of repositories. Sensors, devices, wearables, electronic health records, genomics, proteomics, metabolomics, and microbiomics; all threads in the tapestry woven to form these digital avatars. Moreover, intertwined within their essence lay the intricate web of AI models – machine learning, deep learning, and reinforcement learning – imbuing them with the ability to learn from the vast troves of data, to predict, simulate, and optimize behavior and performance under a myriad of conditions.
The sanctum of clinical trials finds itself metamorphosed by the advent of these digital twins, which offer a plethora of boons that reverberate through the hallowed halls of research. Patient recruitment, a perennial hurdle, finds an ally in digital twin technology: the ability to simulate patient profiles based on diverse data sources significantly enhances the process of identifying and enrolling suitable candidates.
Furthermore, these digital doppelgangers will contribute to refining trial designs, streamlining execution, and ultimately truncating the duration and cost, while elevating the success rates and overall quality of the trials.
But the impact isn’t solely economic or logistical. It extends far beyond, touching upon the ethical and human dimensions of clinical trials. Digital twins, with their capacity to replicate patient profiles virtually, can usher in a new era of personalized medicine. They can obviate the need for in-person placebo groups, invasive procedures, or harm-causing events, fostering trials that are more ethical and patient-centric.
In February this year, a team from the Karolinska Institutet demonstrated a new way of using digital twins to predict the responses of patients with rheumatoid arthritis to biological drugs, weaving together insights from clinical and biomarker data.
Other examples include optimizing dosages of medications like warfarin for patients with atrial fibrillation by leveraging the amalgamation of pharmacogenetic and clinical data, and designing adaptive trials for those grappling with the formidable adversary of glioblastoma, with digital twins entwining imaging and molecular data in a dance of precision and hope.
The landscape of drug discovery and development is undergoing a remarkable evolution, driven by a convergence of innovative technologies that promise to redefine the trajectory of medical science.
The daunting challenges, long timelines, exorbitant costs, and high failure rates historically associated with drug development are being confronted by cutting-edge advancements in AI, OOC, and the utilization of digital twins. As we embrace these advancements, the possibility of evolution, transforming what once took years into a matter of weeks, is a testament to the astonishing potential for progress in the field of drug discovery and healthcare as a whole.