AI Milestones: FDA’s ISTAND program accepts AI-based assessment tool for depression

In this series, Parexel’s Chief Data Officer Stephen Pyke considers the potential and challenges of artificial intelligence as AI comes of age in drug development. His conversations with research experts, in disciplines ranging from early development to clinical trial design, shed light on state-of-the-art applications and on AI’s dazzling possibilities to advance novel medicine, improve development efficiencies and continually enhance patient outcomes.   

We passed a significant milestone in January when FDA announced that the first AI-based tool, Deliberate AI’s depression and anxiety model AI-COA™, had been accepted into ISTAND, FDA’s program for evaluating innovative technologies intended for use in clinical trials.1 Acceptance begins a three-step process toward qualification as a regulatory-accepted tool for assessing depression severity in clinical research. Chief Data Officer Stephen Pyke talked with neurology expert Andreas Lysandropoulos and Chief Patient Officer Stacy Hurt about expectations for AI-COA and its potential impact on depression research and treatment.

Stephen Pyke: According to Deliberate AI, their AI-based clinical outcome assessment (COA) tool uses a machine learning algorithm to assess multimodal behavioral and physiological indicators of depression: signals such as eye movement, facial expression, voice patterns and vital signs.2 Until now, we’ve relied on rating scales, primarily the HAM-D (Hamilton Depression Rating Scale) and HAM-A (Hamilton Anxiety Rating Scale), to measure patients’ levels of sadness, lethargy, sleep disruption and so on. Obviously, these are subjective measures.
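
To make the idea concrete, here is a minimal sketch of how multimodal summary features might be mapped to a single severity score. The feature names, model choice and synthetic data are illustrative assumptions, not Deliberate AI’s actual AI-COA algorithm.

```python
# Illustrative sketch only: feature names, model choice and synthetic data are
# assumptions for explanation, not Deliberate AI's actual AI-COA algorithm.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Each row is one patient session; columns are summary features extracted from
# different modalities (eye movement, facial expression, voice, vital signs).
feature_names = [
    "blink_rate_per_min",      # eye movement
    "gaze_fixation_ms",        # eye movement
    "smile_intensity_mean",    # facial expression
    "speech_pause_ratio",      # voice
    "pitch_variability_hz",    # voice
    "resting_heart_rate_bpm",  # vital signs
]
X = rng.normal(size=(40, len(feature_names)))  # synthetic feature matrix
y = rng.uniform(0, 52, size=40)                # synthetic HAM-D-like labels (0-52)

# Fit a simple linear model mapping multimodal features to a severity score.
model = Ridge(alpha=1.0).fit(X, y)
print(f"Estimated severity for first session: {model.predict(X[:1])[0]:.1f}")
```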

Stacy Hurt: So we’ll have the patient voice through the traditional COA assessment, and then we’ll also have the objective biometric measurements through sensors and automated facial expression analysis.

Stephen Pyke: That’s right. Deliberate’s AI-COA model promises to be an objective measure—free of bias—based on physiological indicators. Objective measurement of symptoms will be a major step forward.

Andreas Lysandropoulos: Certainly, a major advance that sends a clear signal from FDA, not just for depression but more generally for neurological and psychiatric conditions, where the main challenge is the lack of objective assessments of disease status and therapeutic effect. In these conditions, unmet need remains high after a long period of relatively limited progress. Multimodal AI tools like this, if validated, will lead to better clinical trials in depression: more sensitive, more reliable, faster and less expensive. The key in validating an AI tool is to show a positive correlation with the traditional COAs.
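
As a rough illustration of that validation step, the sketch below checks whether AI-derived scores track clinician-rated scores on a traditional COA, using a Spearman rank correlation on synthetic placeholder data; none of the numbers are trial results.

```python
# Illustrative validation check: do AI-derived scores correlate positively with
# a traditional, clinician-rated COA (here, HAM-D)? Data are synthetic placeholders.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
hamd_scores = rng.uniform(0, 52, size=60)                # clinician-rated HAM-D totals
ai_coa_scores = hamd_scores + rng.normal(0, 4, size=60)  # hypothetical AI-derived estimates

rho, p_value = spearmanr(ai_coa_scores, hamd_scores)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3g})")
```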

Stephen Pyke: From a data perspective, it’s great to watch this technology unfold. FDA launched ISTAND in 2020 to create a pathway for technologies seeking qualification, and this is just the first of many AI-based models in the pipeline. We don’t have details yet, but once it’s qualified, I foresee that AI-COA will be used at investigational sites for data capture and recording. AI will crunch the massive amounts of signal information into meaningful data points for use in phase II and III studies.

Andreas Lysandropoulos: Such AI tools can revolutionize how we run psychiatric trials. The COAs we use now are a time-consuming burden on both patients and researchers. AI-based COAs will likely be faster and yield more reliable data. In patient recruitment, for example, AI-COA could be used to pre-screen patients and reduce screening failures.
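
As a purely hypothetical illustration of that pre-screening idea, candidates whose AI-estimated severity meets a protocol’s entry criterion could be prioritized for full in-person screening; the threshold and scores below are assumptions, not protocol values.

```python
# Hypothetical pre-screen: prioritize candidates whose AI-estimated severity
# meets an assumed protocol entry criterion before full in-person screening.
ENTRY_THRESHOLD = 17.0  # assumed HAM-D-style cutoff; real protocols define their own

candidate_scores = {"P001": 22.5, "P002": 9.0, "P003": 18.3}  # AI-estimated severity
eligible = [pid for pid, s in candidate_scores.items() if s >= ENTRY_THRESHOLD]
print(eligible)  # -> ['P001', 'P003']
```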

Stephen Pyke: Of course, we can expect challenges. One will be the sheer volume of data AI-based tools generate and the data governance issues that come with it; FDA has a lot to sort out there. Another will be patient privacy: visual data capture is quite intrusive, and data privacy protections will have to be put in place.

Stacy Hurt: AI adds another layer that we have to consider. It’s not just about what information we’re collecting and where that information is going. It’s also about educating the patient about what we’re doing with AI. You know, “AI” is still something of a scary term, with suggestions of Big Brother watching.

Stephen Pyke: So we’re looking ahead to the benefits, but AI could also be a negative experience for patients. How do you think people will respond to this kind of data capture in a clinical trial?

Stacy Hurt: A lot will depend on helping patients understand why we’re collecting these measurements: explaining that we used to measure their symptoms one way, and that now we can add findings from a more objective, AI-based measure. The message isn’t just about how it benefits the research process. It’s about how well researchers explain how it benefits the patient.

Stephen Pyke: You’re suggesting that could make a difference in whether patients comply. 

Stacy Hurt: Mental disorders are still highly stigmatized. People don’t want to talk about what they’re feeling. In terms of patient benefits, maybe AI could be used to recognize patterns and identify episodes of severe depression that lead to bad outcomes, such as hospitalizations, acute events and so on. It will take time and effort to explain this to research participants, simply and clearly. Who is going to do that? I think AI is an opportunity to rethink the upfront education of patients in clinical trials. That’s a step forward, too. Patients want to be part of the process. Patients are end users of AI.

Andreas Lysandropoulos: I want to say, it’s very good that FDA is inviting innovators to develop AI-based tools like this. Sponsors are seen as cautious and conservative. But this is an example of FDA taking the lead to establish trust, to help sponsors be more open and apply the innovations that are going to improve research and reduce costs. This is the way to change the world.

Stephen Pyke: AI is already changing the way we do drug development, especially when it comes to understanding human biology: helping to identify druggable targets for new medicines and to select the most promising molecules to address those targets. AI-COA is a nice example of AI moving into the clinical phases. A recent publication by FDA staffers points to exponential growth in the number of submissions that include AI data and technologies.3 We can see a future for clinic-based AI assessment tools that clinicians could use to support diagnosis and to track patient response to treatment. Farther out, home-based AI tools may support remote data collection for decentralized trials. Today, AI-COA is big news: cause for celebration, we hope, and just the first of many.

 

References

1. FDA’s ISTAND Pilot Program accepts submission of first artificial intelligence-based and digital health technology for neuroscience. U.S. Food and Drug Administration.

2. Deliberate AI announces its depression and anxiety model, AI-COA™, is the first AI/ML initiative to be accepted into the FDA’s ISTAND Pilot Program to advance drug development. Deliberate AI.

3. Liu Q, et al. Landscape analysis of the application of artificial intelligence and machine learning in regulatory submissions for drug development from 2016 to 2021. Clin Pharmacol Ther. 2023;113(4).
 
