A comparative study assessing the stand-alone performance of seven artificial intelligence (AI) software modalities for lung nodule detection on chest X-rays found that four of the AI modalities outperformed the mean score of reviewing radiologists.
In the retrospective, multicenter comparison study, recently published in Radiology, researchers compared seven commercially available AI modalities against the mean performance of 17 reviewing radiologists. The cohort comprised 386 patients (mean age of 64), 144 of whom had at least one nodule, according to the study.
The reviewed AI modalities included Annalise Enterprise CXR (Annalise.ai), InferRead DR Chest (Infervision), INSIGHT CXR (Lunit), Milvue Suite-Smart Urgencies (Milvue), ChestEye (Oxipit), AI-Rad Companion Chest X-ray (Siemens Healthineers) and Med-Chest X-Ray (VUNO).
The researchers found that INSIGHT CXR, Annalise Enterprise CXR, Milvue Suite-Smart Urgencies and ChestEye offered higher stand-alone AUCs (93 percent, 90 percent, 86 percent, and 88 percent, respectively) than the mean AUC of the reviewing radiologists (81 percent). The study authors noted that InferRead DR Chest, AI-Rad Companion Chest X-Ray and Med-Chest X-Ray showed no difference from radiologists for stand-alone detection of lung nodules.
INSIGHT CXR also demonstrated the highest sensitivity rate (89 percent), which was 18 percentage points higher than the mean sensitivity rate for radiologists (71 percent) and 14 percentage points higher than the next highest AI modality sensitivity rate (75 percent for Med-Chest X-Ray). Sensitivity and specificity rates were not available for Annalise Enterprise CXR and ChestEye, according to the researchers.
“For lung nodule detection on chest radiographs, four AI algorithms showed better performance than human readers, and three AI algorithms showed no evidence of a difference in performance compared with human readers,” wrote lead study author Kicky G. van Leeuwen, MSc, who is affiliated with the Department of Medical Imaging at Radboud University Medical Center in Nijmegen, the Netherlands, and colleagues.
Three Key Takeaways
- Some of the AI modalities outperformed radiologists in lung nodule detection. The study revealed that four AI modalities (INSIGHT CXR, Annalise Enterprise CXR, Milvue Suite-Smart Urgencies, and ChestEye) demonstrated better stand-alone performance in lung nodule detection on chest X-rays compared to the mean performance of reviewing radiologists. This suggests the potential of AI to enhance diagnostic capabilities on chest X-rays.
- INSIGHT CXR showed superior sensitivity. INSIGHT CXR exhibited the highest sensitivity rate (89 percent) among the AI modalities, surpassing both the mean sensitivity rate of radiologists (71 percent) and the next highest AI modality sensitivity rate (75 percent for Med-Chest X-Ray). This emphasizes the capability of INSIGHT CXR in identifying lung nodules, which is crucial for early and accurate diagnosis of lung cancer.
- Varied performance across nodule characteristics. For nodules with very subtle conspicuity, INSIGHT CXR and Annalise Enterprise CXR had the highest AI areas under the curve (AUCs). Additionally, for the smallest nodule diameters (5 to 9 mm), INSIGHT CXR, AI-Rad Companion Chest X-ray, and Annalise Enterprise CXR demonstrated superior AUCs compared to the other AI products and the mean AUC for radiologists. This underscores the importance of considering nodule characteristics when evaluating AI performance in lung nodule detection.
For nodules deemed to have very subtle conspicuity, INSIGHT CXR and Annalise Enterprise CXR had the highest AI AUCs at 76 percent and 70 percent, respectively. The other five AI modalities had AUCs lower than the mean AUC for radiologists (65 percent) for these nodules.
For the smallest nodule diameters (5 to 9 mm), INSIGHT CXR, AI-Rad Companion Chest X-ray and Annalise Enterprise CXR demonstrated the highest AI AUCs (93 percent, 89 percent, and 87 percent, respectively), while none of the other AI products surpassed the mean AUC for radiologists (86 percent).
The study authors emphasized that their comparative study protocol enhances transparency in the ever-burgeoning AI market in radiology.
“It is conceivable that in the future, radiology departments will require vendors to participate in transparent and comparative evaluations as a prerequisite for purchasing AI products,” added van Leeuwen and colleagues. “Similarly, health care insurers might base reimbursement decisions on performance comparisons.”
(Editor’s note: For related content, see “FDA Clears AI Software for Enhanced Lung Nodule Detection on Chest X-Rays,” “Study: AI Enhances Abnormality Detection on CXR Across Radiologist Experience Levels” and “Study Shows Benefits of AI in Detecting Lung Cancer Risk in Non-Smokers.”)
In regard to study limitations, the researchers noted the study's focus on stand-alone performance of the AI modalities even though the majority of commercial AI software products are intended for adjunctive decision support. They also acknowledged that no clinical information was disclosed to readers and that the reviewing radiologists read images on a Grand Challenge platform as opposed to a PACS. Several vendors with eligible AI products declined to participate in the research, according to the study authors.