Multimodal generative artificial intelligence (AI) may yield significant efficiencies in radiology reporting as well as enhanced sensitivity for an array of abnormalities on chest X-rays (CXRs).
In a retrospective study recently published in Radiology, researchers compared the use of preliminary AI-generated reports (via AIRead, Soombit.ai) with unassisted interpretation for 758 CXRs reviewed by five radiologists.
Use of the AI-generated reports led to a 42 percent reduction in the average reading time for CXRs in comparison to unassisted evaluation by radiologists (19.8 seconds vs. 34.2 seconds), according to the study authors.
In a subset analysis of 258 cases, the researchers found that the use of AI-generated reports led to a nearly 10 percentage point increase in sensitivity for pleural lesions (87.4 percent vs. 77.7 percent) and a greater than six percentage point increase in sensitivity for widened mediastinum (90.8 percent vs. 84.3 percent).
“The results … demonstrated that the introduction of AI-generated reports could generally increase the efficiency and quality of radiologic interpretations, decrease the reading time, and improve the accuracy of the reports,” wrote lead study author Eun Kyoung Hong, M.D., who is affiliated with the Departments of Radiology at Mass General Brigham and Brigham and Women’s Hospital in Boston, and colleagues.
Without AI-generated reports, the researchers noted a broad range of sensitivities (54.2 percent to 80.7 percent) and specificities (84.9 percent to 93.4 percent) among the five reviewing radiologists for detecting abnormalities on CXR. However, the ranges for sensitivity and specificity rates were narrower with the use of AI-generated reports. The study authors noted sensitivity rates ranging between 71.1 and 80.8 percent with AI-generated reports, and specificity rates between 85.2 and 87.3 percent.
Three Key Takeaways
1. Increased efficiency. AI-generated reports led to a 42 percent reduction in reading time for chest X-rays (CXRs), improving workflow efficiency for radiologists.
2. Enhanced sensitivity for a variety of abnormalities. The AI-assisted approach improved sensitivity for detecting abnormalities such as pleural lesions (+9.7 percentage points) and widened mediastinum (+6.5 percentage points), while also homogenizing diagnostic performance across radiologists.
3. Variable performance. While AI-assisted reporting improved sensitivity for conditions like consolidation (+17.9 percentage points) and lung opacity (+22.5 percentage points), it was associated with lower sensitivity for lung nodules (80 percent vs. 86.7 percent) in comparison to unassisted interpretation.
“ … The factual correctness analysis demonstrated that the diagnostic performances of the reports became somewhat homogeneous across radiologists after the introduction of AI-generated reports,” pointed out Hong and colleagues.
While the researchers noted double-digit increases in sensitivity for consolidation (74.7 percent vs. 56.8 percent) and lung opacity (51.3 percent vs. 28.8 percent) with the AI-generated reports, they cautioned that the AI-assisted approach had lower sensitivity for lung nodules (80 percent vs. 86.7 percent) than unassisted radiologist assessment.
Regarding study limitations, the authors noted the retrospective design, the small number of reviewing radiologists (five), and the lack of comparisons with prior X-rays.