In an ambitious move to modernize the drug and device approval process, the FDA introduced “Elsa,” an AI assistant designed to fast-track reviews. However, a CNN investigation reveals that Elsa frequently “hallucinates” nonexistent studies and misrepresents real research, making it unreliable for high-stakes regulatory decisions.
While the FDA touts Elsa’s ability to summarize documents and identify priority inspections, internal staff have expressed serious concerns. Six current and former officials say the tool is useful only for routine tasks such as drafting email templates and is not viable for scientific reviews because of its factual inaccuracies. Elsa also cannot access core regulatory submissions, further limiting its value in real-world FDA workflows.
Despite public claims by HHS Secretary Robert F. Kennedy Jr. and FDA Commissioner Dr. Marty Makary about Elsa’s efficiency, internal feedback paints a picture of a premature rollout lacking robust oversight. For now, use of Elsa remains optional, and adoption across FDA departments appears limited.