Artificial Intelligence and Machine Learning in the Development of Drug and Biological Products

BakerHostetler

On May 10, the U.S. Food and Drug Administration (FDA) published a discussion paper, “Using Artificial Intelligence & Machine Learning in the Development of Drug & Biological Products.” The paper is a collaboration between FDA’s Center for Drug Evaluation and Research (CDER), the Center for Biologics Evaluation and Research (CBER), and the Center for Devices and Radiological Health (CDRH), including its Digital Health Center of Excellence (DHCoE). The paper is neither FDA guidance nor policy but aims to initiate stakeholder feedback and promote subsequent discussions to help inform future regulatory activities.   

The paper discusses considerations and concerns regarding the use of artificial intelligence and machine learning (AI/ML) in drug development. While some existing standards and practices may be applicable to AI/ML in drug development, specific challenges remain to be addressed. FDA emphasizes adopting a risk-based approach to evaluating and managing AI/ML in an effort to foster innovation while safeguarding public health.

Adapting principles from the Government Accountability Office's (GAO's) AI accountability framework, FDA is seeking feedback on three key areas related to AI/ML in drug development:

  1. Human-led governance, accountability and transparency: Human-led AI/ML governance ensures adherence to legal and ethical values in drug development. It encompasses risk management, documentation, transparency and explainability. Transparency provides insight into how AI/ML systems are developed and used, while explainability offers evidence supporting their outputs.
  2. Quality, reliability and representativeness of data: AI/ML's reliance on data makes data attributes crucial. Issues such as bias, integrity, privacy, provenance, relevance and replicability should be considered, and early clarification of data access is essential to the process.
  3. Model development, performance, monitoring and validation: This area addresses the development, performance assessment, monitoring and validation of AI/ML models used in drug development. It highlights the need for rigorous evaluation of model performance, ongoing monitoring for potential biases or errors, and validation against established standards.

For each of the three areas, the discussion paper poses specific questions aimed at soliciting feedback and shaping the future regulatory landscape for AI/ML in drug development.

The release of this discussion paper is part of a broader effort to initiate dialogue with stakeholders and explore considerations for the use of AI/ML in developing human drugs and biological products. It reflects FDA's commitment to engaging with diverse perspectives and shaping the regulatory landscape for AI/ML applications in drug development.

Given the rapid advancement and potential impact of AI/ML, FDA recognizes the need for adaptation and, eventually, regulatory oversight. The agency is actively working toward that goal, and continued regulatory updates and direction in this domain are anticipated.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© BakerHostetler | Attorney Advertising