Interruptions are the bane of a radiologist's workday and can lead to interpretation errors. A chatbot based on artificial intelligence (AI) aims to reduce these disruptions while also providing information quickly to both clinicians and patients.
The chatbot mobile app provides evidence-based answers to routine radiology questions from clinicians, other radiologists, and even patients. The process begins with a simple text message or spoken request, followed by a reply with the relevant information -- or additional questions to home in on the inquiry -- in a conversational style.
"We want radiologists to be interrupted less and to be able to focus on interpretation, and we want physicians to get the information they need super quickly," said Dr. Kevin Seals, a radiology resident at Ronald Reagan UCLA Medical Center. "We also want hospitals to have more evidence-based, high-quality information that they can circulate to their ecosystem."
The inspiration for the chatbot came from a common problem that all radiologists face: phone calls, questions, and other disruptions that create speed bumps in workflow and delays in patient care.
"To be comprehensive and careful with imaging, you have to be very systematic and thoughtful," Seals told attendees at an AI conference in Boston this month hosted by the Insight Exchange Network (IEN). "But one thing that comes up a lot is interruptions, which can be disruptive to the radiologist's workflow. You might go back to [the image] and forget where you are. There is data that shows [interruptions] could result in a number of errors."
At the same time, the chatbot is also designed to help physicians, surgeons, and other caregivers who often need to contact a radiologist with specific expertise.
"Maybe they need a brain radiologist, but they only have the phone number for a chest radiologist," Seals said. "So getting the correct radiologist on the phone can be a problem for them as well."
To make the most of the technology's capabilities, Seals took an informal survey of physicians at the University of California, Los Angeles (UCLA) to determine which tasks performed by radiologists could potentially be automated in a chatbot.
Clinicians most frequently wanted to know which imaging modality would be best for a certain patient situation. For example, if a patient has an abscess in the abdomen, should the physician order an MRI or CT scan?
"The next issue was with contrast," Seals said. "Can a patient with normal kidney function get an MRI or CT scan with contrast? The next issue was a matter of timing: My patient has this metallic implant in their spine. Can I do an MRI?"
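The contrast question above can be reduced to a threshold check on kidney function. The sketch below is a minimal illustration, not the app's actual logic; the eGFR cutoffs used (30 and 15 mL/min/1.73 m²) are assumed example values, since real policies vary by institution and contrast agent.

```python
def contrast_recommendation(egfr: float) -> str:
    """Illustrative triage for contrast administration based on eGFR.

    The cutoffs (30 and 15 mL/min/1.73 m^2) are assumed example
    thresholds, not the chatbot's actual rules.
    """
    if egfr >= 30:
        return "contrast generally acceptable"
    elif egfr >= 15:
        return "weigh risk vs. benefit; consult radiologist"
    else:
        return "avoid contrast unless essential; consult radiologist"
```

A patient with normal kidney function (eGFR well above 30) would fall into the first branch, matching the routine case clinicians asked about most often.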
With the chatbot, users' questions are processed by a neural network. For example, someone could ask: "In a patient who has an estimated glomerular filtration rate of 4.4, is there sufficient kidney function to receive a CT scan with contrast?" The user is then routed to an information module that advises him or her on what to do.
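The routing step described above can be sketched as a classifier that maps a free-text question to a topic and then to an information module. This is a simplified stand-in: a keyword matcher replaces the neural network, and the category names and keywords are hypothetical.

```python
# Hypothetical categories and trigger keywords; the real app uses a
# neural network rather than keyword matching.
CATEGORIES = {
    "contrast": ["contrast", "gadolinium", "filtration rate", "kidney"],
    "modality": ["mri or ct", "which scan", "best modality"],
    "safety":   ["implant", "pacemaker", "metallic"],
}

def classify(question: str) -> str:
    """Route a question to an information module, or escalate."""
    q = question.lower()
    for category, keywords in CATEGORIES.items():
        if any(k in q for k in keywords):
            return category
    # No match: hand the user off to a human radiologist.
    return "escalate_to_radiologist"
```

The eGFR question from the article would land in the "contrast" module, while an unrecognized question would fall through to a human radiologist, as described later in the piece.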
Seals and chatbot co-developer Dr. Edward Lee, PhD, from UCLA's David Geffen School of Medicine, showcased the chatbot -- also known as RadChat -- at last year's Society of Interventional Radiology (SIR) annual meeting. At that time, the chatbot was designed for radiologists. Since then, the app has become an AI tool for interventional radiologists and physicians involved in interventional procedures. It's also suitable for patients.
When a patient has questions for his or her physician, the chatbot can simulate the initial consultation process and provide answers. There is also a research chatbot for academic physicians in a hospital.
"This chatbot simulates the initial consultation with an interventional radiologist, helping answer some of the questions a patient might have," Seals said. "It is the first level of information prior to meeting with the physician. This conversational chat interface is intuitive; it is easy for nearly every patient to use."
In addition, because the chatbot is a mobile app, the patient's location is available and he or she could be connected with a local physician who can perform the treatment.
Like many AI systems, the chatbot collects the inquiries in its database and places each one in a corresponding category. This has two important results, Seals explained.
"One, it allows us to generate new training data to make the chatbot smarter, and two, it helps us identify where there are holes in the chatbot that need to be corrected," he said.
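The two uses Seals describes can be sketched as a simple logging structure: every question is stored under its category as future training data, and low-confidence answers are flagged as holes to correct. The class, field names, and 0.6 confidence threshold below are all assumptions for illustration.

```python
from collections import defaultdict

class InquiryLog:
    """Illustrative log of chatbot inquiries (hypothetical structure)."""

    def __init__(self, gap_threshold: float = 0.6):
        self.by_category = defaultdict(list)  # new training data per topic
        self.gaps = []                        # holes needing correction
        self.gap_threshold = gap_threshold    # assumed example cutoff

    def record(self, question: str, category: str, confidence: float):
        self.by_category[category].append(question)
        if confidence < self.gap_threshold:
            self.gaps.append((category, question))

log = InquiryLog()
log.record("Can I give contrast at eGFR 25?", "contrast", 0.92)
log.record("Is a spinal cord stimulator MRI safe?", "safety", 0.41)
```

Here the second question falls below the confidence threshold, so it would surface as a gap for the developers to address, while both questions accumulate as labeled training examples.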
Naturally, there are cases where automation is not appropriate and the user must speak directly with a radiologist. "In those cases, the chatbot makes it very easy to get a human radiologist on the phone," he said.
In the future, Seals and colleagues envision expanding the chatbot's functions to include cardiology and neurosurgery.