Can ChatGPT accurately refer patients to IR for treatment?

Chloe Cross, MD, of the Icahn School of Medicine at Mount Sinai in New York City.

In interventional radiology (IR), most of us quickly become accustomed to explaining what we do -- to other clinicians, patients, their families, and even our own families. But how do patients get their own information about IR, if at all? When a patient receives a diagnosis, multiple medical specialists may offer various treatment recommendations, leaving the patient unsure which to follow.

If patients search for information using new tools like artificial intelligence (AI), what are those tools telling them? How much does AI know about IR? Does it accurately communicate which conditions IR can help treat?

My colleagues and I, including senior author Jenanan Vairavamurthy, MD, explored this question in a study we presented at the Society of Interventional Radiology (SIR) 2024 annual scientific meeting (the abstract was published this month in the Journal of Vascular and Interventional Radiology). We were interested in learning whether the most popular AI bot, ChatGPT, refers patients to IR for treatment.

We formulated a single question that patients might pose to ChatGPT: "I have [a health condition]. What types of doctors treat this?" Ultimately, we asked about 23 different diseases and disease processes, such as hemoptysis, pulmonary embolism, and thoracic aortic aneurysm. We chose health conditions for which a patient might not know what type of doctor could help -- and for which an interventional radiologist might be the best option.

Our team prompted ChatGPT three times for each condition, since it can give different answers to the same question. Its response would usually begin with a caveat: "I'm not a medical doctor, but here are some types of doctors that you could consider looking to for treatment." It would then list the specialties that could treat the condition, in ranked order.
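For readers curious how a query protocol like this might be automated, here is a minimal sketch using the OpenAI Python client. The prompt template mirrors our wording, but the model name, the three-condition subset, and the keyword matching are illustrative assumptions rather than a description of our actual workflow, and the counts it produces would not reproduce our results.

```python
from collections import Counter

from openai import OpenAI  # official OpenAI Python SDK (v1+); assumed for this sketch

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Illustrative subset of the 23 conditions queried; not the full study list.
conditions = ["hemoptysis", "pulmonary embolism", "thoracic aortic aneurysm"]

TRIALS = 3  # each condition was prompted three times, since answers can vary run to run

ir_mentions = Counter()
for condition in conditions:
    prompt = f"I have {condition}. What types of doctors treat this?"
    for _ in range(TRIALS):
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",  # assumed model name for illustration only
            messages=[{"role": "user", "content": prompt}],
        )
        answer = response.choices[0].message.content.lower()
        # Simple keyword check that matches both "radiology" and "radiologist"
        if "interventional radiolog" in answer:
            ir_mentions[condition] += 1

for condition in conditions:
    print(f"{condition}: IR mentioned in {ir_mentions[condition]} of {TRIALS} responses")
```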

For a splenic artery aneurysm, interventional radiologist was its first answer each time. For abdominal aortic aneurysms, vascular surgery was most often its initial response, followed by cardiovascular surgery, with interventional radiology ranked third. However, for pulmonary embolism -- a condition interventional radiologists often treat -- IR was not mentioned at all.

Our most encouraging finding was that, across our 69 prompts (23 conditions, each queried three times), ChatGPT mentioned IR 51 times. This number struck us as surprisingly high, given how few people know much about IR. It bodes well for patients learning how valuable IR can be as a treatment option and may help increase referrals to IR. In addition, the conditions for which IR was not mentioned, such as pulmonary embolism, show our professional society where to focus communication efforts and patient outreach.

We were also pleased to see that ChatGPT did not fabricate responses -- so-called hallucinations -- which it has been known to do. Although that was not specifically the purpose of our research, ChatGPT suggested no specialties that did not exist or that were inappropriate for the conditions we queried. However, that possibility remains a serious risk for patients seeking help, which is why patients should always give the most weight to their clinicians' recommendations.

Patients often turn to the internet -- and may increasingly turn to AI -- to determine if and when to seek care. It is therefore important to learn how patients may see us and whether popular sources are giving them reliable information, so the medical community can respond appropriately.

Our team is considering further research, perhaps comparing ChatGPT to other AI models or search engines -- or expanding the number of disease processes we query. Longer term, we will consider publishing this work as we expand beyond this early phase.

As AI becomes more a part of our lives, its performance in providing health information will become more important to patients and clinicians alike. In a specialty like IR, where relatively few know what we do, this may be pivotal in growing the number of patients we help.

Cross is an interventional radiology resident at the Icahn School of Medicine at Mount Sinai in New York City. The comments and observations expressed are those of the author and do not necessarily reflect the opinions of AuntMinnie.com.

