ACR and RSNA leaders request the FDA slow its momentum with AI tools that work without radiologist oversight.
Artificial intelligence (AI) tools that operate autonomously in radiology are not yet safe, and their adoption should be pursued cautiously, two industry leadership organizations have told the U.S. Food & Drug Administration (FDA).
In a letter detailing their reservations, leaders of the American College of Radiology (ACR) and the Radiological Society of North America (RSNA) expressed concern that these AI tools have not been tested sufficiently to ensure their safety, and that no framework is currently in place to address that problem. The message comes in response to a two-day FDA workshop convened earlier this year to assess AI’s integration into medical imaging.
“The ACR and RSNA believe it is unlikely the FDA could provide reasonable assurance of the safety and effectiveness of autonomous AI in radiology patient care without more rigorous testing, surveillance, and other oversight mechanisms throughout the total product life cycle,” said Howard Fleishon, M.D., ACR chairman, and Bruce Haffty, M.D., RSNA chairman, in the letter. “We believe this level of safety is a long way off, and while AI is poised to assist physicians in the care of their patients, autonomously functioning AI algorithms should not be implemented at this time.”
To avoid potential problems with patient safety, care, and outcomes, both leaders urged the FDA to proceed slowly, specifically waiting until more radiologists are comfortable using AI algorithms.
Current AI Use
Much of their concern stems from the limited adoption of AI algorithms across the field to date. According to a recent ACR survey, only 30 percent of radiologists use AI in clinical practice, averaging 1.2 algorithms per radiologist. Most providers instead use the tools solely for research.
Approved algorithms on the market are not intended to work without human oversight. Instead, they assist in image interpretation and exam prioritization, as well as help with some administrative tasks. In most cases, the tools were not tested to be generalizable across a heterogeneous patient population or across varying modalities, the leaders said.
“It is not surprising that 93 percent of radiologists using AI in our survey said that the results of AI in their practices are inconsistent, and 95 percent said they would not use the AI algorithms without physician overread,” they wrote.
Recommendations to the FDA
If the goal is to continue pursuing autonomous AI in radiology, Fleishon and Haffty said, there are several steps the FDA should consider.
Safeguarding Radiologist Involvement
Most important, Fleishon and Haffty pointed out, safe image interpretation, such as the analysis of screening mammography, still requires active radiologist involvement, and removing the provider, along with the context he or she brings to each interpretation, could have negative consequences.
“We are not confident that these examinations can be safely excluded from radiologist interpretation while maintaining the current level of patient safety,” they wrote.
Overall, they recommended the FDA focus its efforts on algorithms that help providers with population health as a way to integrate autonomous AI into radiology care, such as algorithms that can incidentally detect and quantify potentially undiagnosed chronic disease. The human touch is still necessary when examining, recognizing, and characterizing disease, they said.
“The value that human interpretation with independent medical judgement brings to patient care cannot currently be replaced,” they said.