Radiologists’ experience allows them to unconsciously detect lung nodules on CT, according to research presented at RSNA 2016.
Unlike radiologists, medical students do not show unconscious detection of lung nodules, according to a study presented at RSNA 2016.
Researchers from Massachusetts studied the unconscious detection of lung nodules among radiologists and evaluated eye movement metrics in untrained readers (first-year medical students).
Twenty-four subjects participated in the study: 12 radiologists and 12 medical students. Each interpreted nine normal and nine abnormal axial chest CT scans; the abnormal scans contained 16 lung nodules in total. Eye tracking recorded the location and duration of gaze, and the efficiency of each group’s visual search pattern was evaluated using visual dwell time on healthy tissue versus on a lung nodule, the total number of eye movements (saccades), and the total number of images viewed.
The results showed that both radiologists and students dwelled longer on the nodules, compared with healthy lung tissue. The radiologists also dwelled longer on the nodules when they were not consciously detected. However, the medical students did not fixate longer on a lung nodule versus healthy lung tissue when not consciously detected.
The radiologists scrolled through the image set 2.5 times more than the students and made significantly more saccades, averaging 376 versus 215. The radiologists were nonetheless more efficient, making an average of 0.46 saccades per image compared with 0.62 for the students. The students also tended to bounce from one location to another within an image before moving on, and only rarely returned to an image they had viewed previously.
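As a back-of-envelope check (the implied image counts below are derived from the reported averages, not figures stated in the study), dividing total saccades by saccades per image gives the approximate number of images each group viewed, and their ratio lines up with the reported 2.5-fold difference in scrolling:

```python
# Implied images viewed ≈ total saccades / saccades per image (derived, not reported).
radiologist_images = 376 / 0.46  # radiologists: 376 saccades at 0.46 per image
student_images = 215 / 0.62      # students: 215 saccades at 0.62 per image
ratio = radiologist_images / student_images

print(round(radiologist_images))  # ≈ 817 images
print(round(student_images))      # ≈ 347 images
print(round(ratio, 1))            # ≈ 2.4, consistent with "2.5 times more" scrolling
```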
The researchers concluded that, unlike radiologists, medical students do not show unconscious detection of lung nodules, and that both search pattern and search efficiency were significantly worse for the students than for the radiologists. “These data suggest that during the process of radiological training, both conscious and unconscious learning is developed that influence the success of the search, the efficiency of the search, and the pattern in which the search is undertaken,” they wrote. “Although some component of radiological learning is the result of specific training and conscious processes, additional unconscious learning likely occurs that influences radiological performance.”