The Food and Drug Administration (FDA) has launched a new generative AI tool it hopes will make employees more efficient at everything from clinical evaluations to inspections. The tool, called Elsa, was developed in-house to help with reading, writing and summarizing.
“As we learn how employees are using the tool, our development team will be able to add capabilities and grow with the needs of employees and the agency,” FDA Chief AI Officer Jeremy Walsh said in a statement.
What will the AI do?
FDA leadership is looking to modernize the entire agency, and Elsa is already making a difference. The tool shortens the time needed for scientific evaluations, summarizes adverse event reports for medicines and therapeutics, expedites clinical reviews, generates code to develop non-clinical databases, and helps inspectors determine where they are most likely to find violations and safety issues.
How will they ensure it’s secure and accurate?
The FDA said Elsa is secure and relies only on internal documents.
“All information stays within the agency, and the AI models are not being trained on data submitted by the industry,” FDA Commissioner Marty Makary said.
Elsa is a tool designed to support experts, who retain full responsibility for reviewing and verifying all information before publication or regulatory decision-making, according to the FDA.
Even with safeguards, there is potential for error. The Trump administration learned that with the recent release of the Make America Healthy Again report, which cited studies that don’t actually exist. Experts say such fabrications are a known failure mode of generative AI, commonly called hallucinations.
White House Press Secretary Karoline Leavitt said there were “formatting issues” with the report but maintained that they did not negate its substance.
Is the FDA the first federal agency to create its own AI system?
The federal government doubled its AI use from 2023 to 2024. According to data collected by the Chief Information Officers Council, the biggest users are the departments of Health and Human Services, Veterans Affairs, the Interior and Homeland Security.
Approximately 46% of those uses are categorized as mission-enabling, covering internal operations such as financial management, human resources, and facilities and property management. AI is also used for cybersecurity, IT and other administrative functions.
More than 40% of the AI code is custom developed and publicly available, which the CIO Council contends will help with collaboration and innovation.