
Could AI Help Mitigate Radiology Workloads for Unremarkable Chest X-Rays?


New research suggests that AI software may help rule out pathology on over 50 percent of unremarkable chest X-rays at a 98 percent sensitivity threshold.

Artificial intelligence (AI) software may facilitate an approximately 20 percent reduction in case volume for chest X-rays, according to emerging research.

For the retrospective study, recently published in Radiology, researchers examined off-label use of the AI software Annalise Enterprise CXR, version 2.2 (Annalise.ai) for ruling out pathology on unremarkable chest X-rays. The cohort of 1,961 patients (median age of 72) with chest radiographs was drawn from four separate hospitals, according to the study. The researchers noted that 730 of the X-rays (37.2 percent) were deemed to have unremarkable findings.

At a 98 percent sensitivity threshold, the researchers found that the AI software had a 94.1 percent negative predictive value (NPV) and a 52.7 percent specificity for ruling out pathology on unremarkable chest X-rays.
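As a rough consistency check (a minimal sketch rather than the authors' actual analysis, assuming simple per-radiograph counts), the reported NPV follows from the stated sensitivity, specificity and the 37.2 percent proportion of unremarkable exams:

# Sketch only: checking that the reported ~94 percent NPV is consistent
# with the stated sensitivity, specificity and cohort proportions.
n_total = 1961                     # radiographs in the cohort
n_unremarkable = 730               # deemed unremarkable (37.2 percent)
n_remarkable = n_total - n_unremarkable

sensitivity = 0.98                 # threshold for detecting remarkable findings
specificity = 0.527                # unremarkable exams correctly ruled out

true_negatives = specificity * n_unremarkable        # correctly ruled out
false_negatives = (1 - sensitivity) * n_remarkable   # remarkable exams missed

npv = true_negatives / (true_negatives + false_negatives)
print(f"Estimated NPV: {npv:.1%}")  # ~94 percent, in line with the reported 94.1 percent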


Here one can see missed critical findings on chest X-ray, including an acute rib fracture (A), enlarged hilar lymph nodes (B), a tumor mimicking pleural plaque (C) and a central venous catheter possibly entering the azygos vein (D). The AI software missed the diagnosis in case A, detected the findings in case B at all sensitivity thresholds and made an accurate diagnosis in cases C and D at 99 percent and 99.9 percent sensitivity thresholds. (Images courtesy of Radiology.)

In comparison to radiology reports, the AI software had nearly double the rate of critical misses (17.1 percent vs. 8.9 percent), according to the study authors. However, at a sensitivity threshold of 95.4 percent, the researchers said the AI software had a 4.6 percent rate of missed remarkable findings on chest X-rays, in contrast to 12.8 percent for radiology reports.

“This means that AI could theoretically have been used at sensitivity thresholds greater than 95.4% without decreasing patient safety relative to the current standard, which could have resulted in the automatic reporting of up to 63.2% of unremarkable chest radiographs,” wrote lead study author Louis Lind Plesner, M.D., who is affiliated with the Department of Radiology at Herlev and Gentofte Hospital in Copenhagen, Denmark, and colleagues.

Emphasizing the concerning triad of increasing imaging volume, radiologist shortages and high burnout rates, the study authors suggested that deployment of AI may help mitigate the high workload of chest radiography.

In their findings, the researchers noted that operating the AI software at the 98 percent sensitivity threshold, with the aforementioned specificity of 52.7 percent, could lead to a 19.6 percent reduction in case volume for chest X-rays.
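As a rough sanity check rather than the authors' exact methodology, that figure is consistent with simple arithmetic: about 37.2 percent of the radiographs were unremarkable, and roughly 52.7 percent of those would be correctly ruled out by the software at that threshold, so 0.372 × 0.527 ≈ 0.196, or about 19.6 percent of all chest X-rays removed from the reading worklist.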

“These results should be evaluated in a prospective study due to the high potential for mitigating workload challenges in radiology departments,” maintained Plesner and colleagues.

Three Key Takeaways

1. AI efficiency in workload reduction. AI software could potentially reduce the case volume of chest X-rays by approximately 20 percent, easing the workload of radiologists and allowing them to focus on more complex cases.

2. Sensitivity thresholds for AI use. The study found that using the AI software at a sensitivity threshold of 95.4 percent could safely automate the reporting of up to 63.2 percent of unremarkable chest radiographs without compromising patient safety relative to current standards in radiology.

3. Need for further research. Despite promising results, researchers highlighted the necessity of prospective studies to validate the effectiveness and safety of AI in clinical settings, especially given the varying accuracy across different facilities.

In an accompanying editorial, Soon Ho Yoon, M.D., Ph.D., and Eui Jin Hwang, M.D., Ph.D., said the study’s inclusion of chest X-rays from different clinical settings and at different projections in the multicenter cohort “provides a more comprehensive and realistic measure of AI-based automated reporting of unremarkable chest radiographs.”

While expressing concern over varying accuracy and case volume reductions with the AI software across the participating facilities in the study, Drs. Yoon and Hwang emphasized the need for future research to explore the potential merits of AI automation in reducing chest X-ray volume for radiologists.

“Reading unremarkable chest radiographs is a substantial burden in radiologists’ daily workload. Delegating some of this task to AI could allow radiologists to concentrate on more complex and clinically consequential tasks, potentially improving their work-life balance,” added Drs. Yoon and Hwang, clinical associate professors in the Department of Radiology at the Seoul National University Hospital in Seoul, South Korea.

(Editor’s note: For related content, see “Comparative Study Evaluates AI Products for Detecting Tuberculosis on Chest X-Rays,” “Can AI Facilitate Improved Dual-Energy X-Ray Screening for Patients at High Risk of Osteoporosis?” and “Study Finds Four Out of Seven AI Algorithms Offer Better Lung Nodule Detection on X-Rays than Radiologists.”)

Beyond the inherent limitations of a retrospective study design, the authors noted the lack of a standard definition of what constitutes an unremarkable chest X-ray. They also acknowledged that all clinical consequence assessments for missed diagnoses were performed by one reader.
