In a Management Case Study session at the ASHP Midyear 2023 Clinical Meeting & Exhibition titled “Artificial Intelligence (AI) and Machine Learning: New Horizons in the Development of Drug and Biological Products,” Tala Fakhouri, PhD, MPH, associate director for policy analysis at the FDA, discussed the responsible use of AI in drug development, focusing on human-led governance, accountability, and transparency; the quality, reliability, and representativeness of data; and model development, performance, monitoring, and validation.

Dr. Fakhouri’s goals were to discuss how AI (particularly machine learning [ML]) is used in the development of drug and biological products; to describe how the FDA is approaching AI/ML; and to explain what the agency is doing to create a regulatory framework, partly in response to the Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence signed by the Biden-Harris Administration on October 30, 2023.

Dr. Fakhouri explained that the drivers behind the growth in AI health applications include:

• Large datasets (e.g., administrative data, electronic health records, registries, etc.)
• Diverse and multimodal datasets (e.g., digital health technologies, genomic, laboratory, imaging, etc.)
• Improvements in data standards (e.g., ICD-10, National Drug Codes, etc.)
• Improved data interoperability and healthcare data exchange
• Increased computing power
• Advancements in privacy-preserving data approaches
• Breakthroughs in methods (e.g., deep neural networks, reinforcement learning, generative adversarial networks, variational autoencoders, etc.) and causal inference approaches (e.g., structural causal models and causal Bayesian networks).

She presented some challenges with AI in drug development, noting that “…the FDA strives to address these issues over the next few years.” They include:

• AI/ML approaches can only ever be as good as the underlying data; challenges include the scarcity of high-quality, large-scale, fit-for-purpose datasets for development and testing, as well as the identification and mitigation of bias in those datasets
• Poor generalization due to dataset shift or overfitting to confounders
• Opacity of some algorithms (so-called “black box” algorithms)
• Ensuring transparency to users
• Providing oversight/governance for adaptive algorithms
• Data privacy and security, and ethical use of the technology
• Need for regulatory clarity in certain areas.

Dr. Fakhouri explained that in May 2023, the FDA published a discussion paper meant to describe current and potential future uses of AI in drug development and to address specific questions in three areas: human-led governance, accountability, and transparency; quality, reliability, and representativeness of data; and model development, performance, monitoring, and validation. According to the discussion paper, the “FDA will continue to solicit feedback and engage a broad group of stakeholders to further discuss considerations for utilizing AI/ML throughout the drug development life cycle.”

In conclusion, Dr. Fakhouri stressed that, “Technology is now everywhere. It’s happening. It’s a reality. It’s no longer something that is in the future, but what’s very important right now is for all of us in the healthcare setting to make sure that it’s being used in a responsible way.”

The content contained in this article is for informational purposes only. The content is not intended to be a substitute for professional advice. Reliance on any information provided in this article is solely at your own risk.