Liability and responsibility concerns around AI are not significant now, but that will change as the tools enter clinical use.
The push toward greater use of artificial intelligence (AI) in radiology shows no signs of slowing, so it would be a good idea to brush up on the legal aspects of using it, a group of industry experts said.
In an April 9 perspective article published in Skeletal Radiology, legal and radiology experts from Harvard noted that the use of AI in diagnostic imaging is largely uncharted territory. Some risks to providers, practices, and technology developers have been identified, but more could be revealed over time.
To help you prepare, H. Benjamin Harvey, M.D., J.D., a neuroradiologist from Massachusetts General Hospital, and Vrushab Gowda, J.D., a Harvard Law School graduate who is also pursuing his medical degree from the University of North Carolina at Chapel Hill School of Medicine, discussed some specifics of the liabilities around AI use in musculoskeletal imaging.
General negligence
These cases typically deal with medical malpractice, Harvey and Gowda said. While plaintiffs bear the burden of proving the four distinct elements of negligence (duty, breach, causation, and damages), courts will also rely on professional society standards, as well as community practice and department protocols. Consequently, it is vital to create guidelines, complete sufficient user training, and integrate AI correctly.
In addition, while AI can be extremely helpful, relying on it alone, rather than weighing its findings and recommendations against your own judgment, will increase your malpractice risk, they said.
Informed consent
When it comes to using AI with medical imaging, there are two main questions, they said: do you tell your patients you’re using AI, and if so, what do you need to tell them?
Given that AI is currently used largely for research investigations, this issue hasn’t come up much. But once the tools are implemented clinically, the story will be different, especially if the technology eventually becomes more autonomous downstream.
Ultimately, Harvey and Gowda said, proper handling of informed consent will likely follow either a patient-based standard (what a reasonable patient would want to know) or a provider-based standard (what a reasonable provider would disclose), and the line between them can be unclear.
“The rub of the matter is that radiologists should be cognizant of their peers’ disclosures and anticipate the sorts of information patients would find important when deploying AI,” they said.
Risks for radiology groups
When mistakes happen, patients are going to look for someone to hold accountable, they said. If you’ve used an AI tool that results in a negative outcome, your hospital or radiology practice could end up as the defendant – not you.
This situation of “vicarious liability” can be sticky, they explained, because it puts your practice or facility leadership in the hot seat. Consequently, there is a need for clearly written protocols on how to handle these potential situations.
Developer responsibility
You aren’t the only one exposed to liability and risk when you use an AI tool. The developer also bears responsibility when things go wrong, such as the tool mislabeling findings, recommending an unnecessary biopsy, or worse, failing to recommend a needed one.
Under a strict liability framework, a plaintiff who can prove that the mistake resulted from a manufacturing defect, which could be tough to do, could win the case. Conversely, negligence principles call for examining the conduct of the healthcare facility rather than the product.
Either way, they said, radiologists need to be aware of these possibilities and stay current with what’s going on.
Given the lack of clarity around liability and responsibility as AI use grows, Harvey and Gowda said, radiologists must make sure their voices are part of the discussion around developing stringent AI imaging standards.
“Looking forward, imaging departments should articulate clear protocols for their use, to include procedures in the event of human-AI discrepancy,” they said. “They can be aided by [American College of Radiology]- and [Society of Skeletal Radiology]-validated guidelines, training programs, and model practices for deploying AI technologies.”