
Can Deep Learning Automate Amyloid Positivity Assessment on Brain PET Imaging?

In validation testing with 205 18F florbetapir PET scans from 95 patients in the Alzheimer’s Disease Neuroimaging Initiative data set, a deep learning model demonstrated 93.2 percent accuracy and a 97 percent AUC for detecting amyloid-β positivity.

New research suggests that an emerging deep learning model may provide automated, highly accurate differentiation of amyloid positivity on positron emission tomography (PET) in multiple data sets without the need for structural magnetic resonance imaging (MRI).

For the retrospective study, recently published in Radiology, researchers trained a deep learning model, AmyloidPETNet, on 1,538 brain PET scans from 766 patients in a data set from the Alzheimer’s Disease Neuroimaging Initiative (ADNI). After internal testing of the model, researchers performed validation testing with 205 brain 18F florbetapir PET (18F-FBP PET) scans from 95 patients in the ADNI data set.
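
The published architecture details are beyond the scope of this article, but as a rough, hypothetical illustration of the kind of one-step PET classifier involved, a minimal 3D convolutional sketch in PyTorch might look like the following. The layer sizes, input shape, and class name are all illustrative assumptions, not the published AmyloidPETNet design.

```python
# Illustrative stand-in for a one-step amyloid PET classifier.
# NOT the published AmyloidPETNet architecture.
import torch
import torch.nn as nn

class AmyloidClassifierSketch(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1),  # single-channel PET volume in
            nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # global pooling makes the model input-size agnostic
        )
        self.classifier = nn.Linear(32, 1)  # one logit: amyloid-positive vs. negative

    def forward(self, x):
        # x: (batch, 1, depth, height, width) native-space PET volume
        h = self.features(x).flatten(1)
        return self.classifier(h)  # raw logit; apply sigmoid for a probability

model = AmyloidClassifierSketch()
logit = model(torch.randn(2, 1, 64, 64, 64))  # dummy batch of two volumes
prob = torch.sigmoid(logit)                   # probability of amyloid-beta positivity
```

The global-pooling design choice is one common way to accept native-space volumes of varying size without resampling to a fixed template.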

In validation testing, the researchers found that the deep learning model had a 93.2 percent accuracy rate and a 97 percent area under the receiver operating characteristic curve (AUC) for detecting amyloid-β positivity on brain PET scans.
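
For context, accuracy and AUC figures of this kind are typically computed from a model’s continuous output scores and binary ground-truth labels. A minimal sketch with scikit-learn follows; the arrays are dummy placeholders, not the study’s data, and the 0.5 cutoff is an assumed default.

```python
# Computing accuracy (from thresholded predictions) and ROC AUC
# (from continuous scores) with dummy placeholder data.
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

y_true = np.array([1, 0, 1, 1, 0, 0])               # ground-truth amyloid labels
y_score = np.array([0.9, 0.2, 0.7, 0.8, 0.4, 0.1])  # model output probabilities

y_pred = (y_score >= 0.5).astype(int)     # assumed 0.5 decision threshold
print(accuracy_score(y_true, y_pred))     # fraction of scans classified correctly
print(roc_auc_score(y_true, y_score))     # area under the ROC curve
```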

Beyond the 97 percent AUC in original validation testing for assessing amyloid positivity, AmyloidPETNet, an emerging deep learning model, had AUCs ranging from 96 to 99 percent in four other data sets, including the Anti-Amyloid Treatment in Asymptomatic Alzheimer’s Disease (A4) study data set, which included 4,448 brain PET scans.

The AmyloidPETNet deep learning model was subsequently evaluated in four other data sets: 1,329 PET scans from the Open Access Series of Imaging Studies 3 (OASIS3), 127 PET scans from the Centiloid Project, 4,448 PET scans from the Anti-Amyloid Treatment in Asymptomatic Alzheimer’s Disease (A4) study, and 103 PET scans from the Study to Evaluate Amyloid in Blood and Imaging Related to Dementia (SEABIRD), according to the study.

For these data sets, the researchers noted that the AmyloidPETNet deep learning model had accuracy rates ranging from 91.5 to 97.6 percent and AUCs ranging from 96 to 99 percent.

“The main advantage of our approach is that it can provide a rapid and accurate one-step prediction of amyloid status based on native-space PET regardless of imaging equipment, reconstruction algorithms, and Aβ tracer. … As a core AD biomarker, Aβ PET may significantly impact the diagnosis and treatment of patients with mild cognitive impairment, help rule out AD etiology, increase the overall diagnosis confidence, and alter prescribed medication,” wrote lead study author Shuyang Fan, M.D., who is affiliated with the Departments of Engineering and Radiology at Rice University in Houston, and colleagues.

Three Key Takeaways

1. High accuracy and consistency. AmyloidPETNet demonstrated high accuracy (91.5-97.6 percent) and AUCs (96-99 percent) across multiple data sets, indicating reliable performance in detecting amyloid-β positivity regardless of the imaging equipment, reconstruction algorithm, or Aβ tracer used.

2. Clinical impact for patients with suspected Alzheimer's disease. Noting that Aβ PET is a core biomarker for Alzheimer's disease (AD), the study authors suggested the deep learning model may significantly improve diagnosis and treatment of patients with mild cognitive impairment, help rule out AD etiology, and increase overall diagnostic confidence, potentially leading to more appropriate medication prescriptions.

3. Overcoming variability with PET tracers. The deep learning model may help mitigate interobserver variability and heterogeneity in interpretation protocols for different PET tracers, and perhaps assist physician trainees with limited experience.

The researchers also noted that the deep learning model had consistent results despite the use of different PET tracers in other data sets. While the AmyloidPETNet deep learning model was trained on 18F-FBP PET scans, the study authors noted AUCs of 99 percent with 11C-labeled Pittsburgh Compound-B (11C-PIB) PET scans from the SEABIRD data set and 98 percent with 18F-NAV4694 PET scans from the Centiloid Project data set.

“AmyloidPETNet could help overcome the interobserver variability and heterogeneity of interpretation protocols for different tracers and assist physician trainees with limited experience,” added Fan and colleagues.

(Editor’s note: For related content, see “Could MRI-Guided Ultrasound Facilitate Improved Reduction of Amyloid-Beta Load in Patients with Alzheimer’s Disease?,” “Emerging MRI and PET Research Reveals Link Between Visceral Abdominal Fat and Early Signs of Alzheimer’s Disease” and “What Does the Future Hold for Medicare Coverage of Amyloid PET Scans?”)

In regard to study limitations, the authors acknowledged that the cohort was primarily composed of pre-selected cases from Alzheimer’s disease cohorts and that patients with vascular pathology or coexisting neurodegenerative disease were largely excluded. The researchers also noted the lack of a quantitative measurement of Aβ pathology and conceded that the topologic and spatial distribution of pathology could not be captured due to the model’s reliance on labels derived from mean cortical standardized uptake value (SUV) ratios.
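
For readers unfamiliar with that labeling scheme, a hypothetical sketch of deriving a binary amyloid label from a mean cortical SUV ratio (SUVR) follows. The 1.11 cutoff is a commonly cited threshold for 18F-florbetapir with a whole-cerebellum reference region; treat it, and the dummy voxel values, as assumptions rather than the study’s actual pipeline.

```python
# Sketch of SUVR-based binary amyloid labeling: mean cortical uptake
# divided by reference-region uptake, then thresholded.
import numpy as np

def amyloid_label(cortical_uptake: np.ndarray,
                  reference_uptake: np.ndarray,
                  cutoff: float = 1.11) -> bool:
    """Return True (amyloid-positive) if mean cortical SUVR exceeds the cutoff."""
    suvr = cortical_uptake.mean() / reference_uptake.mean()
    return bool(suvr > cutoff)

# Dummy voxel intensities for a cortical composite and a reference
# region (e.g., whole cerebellum); not real PET data.
cortex = np.random.normal(1.4, 0.1, size=10_000)
cerebellum = np.random.normal(1.0, 0.05, size=10_000)
print(amyloid_label(cortex, cerebellum))
```

As the authors note, a single thresholded label of this kind discards where in the brain the pathology sits, which is why spatial distribution could not be captured.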
