Will ultrasound and AI become the tricorder of the future?

Artificial intelligence (AI) will soon bring many benefits to point-of-care ultrasound (POCUS), and perhaps one day it will enable these systems to fulfill the promise of the legendary tricorder from "Star Trek." But it could also lower the bar for who is able to read POCUS scans -- potentially threatening physicians' traditional role.

For now, though, AI could enhance POCUS image acquisition and interpretation, enabling higher diagnostic certainty, better patient outcomes, and improved clinician efficiency, according to Dr. Srikar Adhikari from the University of Arizona. It could also accelerate patient care and improve workflow and throughput, he said in a session at the recent American Institute of Ultrasound in Medicine (AIUM) annual meeting.

AI algorithms could help in simple tasks such as locating standard echocardiography views, for example, or they could help advanced users obtain time-consuming measurements.

"The key is, it would have to fit into the workflow," he said. "It would have to be accessible at the bedside and also hopefully [give] quality feedback to the user."

As another example, AI algorithms could potentially be used to help assess and monitor trends in patients in shock.

"This is an ideal scenario for AI," Adhikari said.

Pros and cons

AI does have some potential drawbacks, however, including possibly eroding the skills of physicians who interpret ultrasound studies, according to Adhikari. Also, AI is good at detecting the conditions represented in its training data, but it may be unable to diagnose conditions it has never seen.

"So you can't replace the human eye completely, even with well-developed algorithms," he said.

Nonetheless, AI will be integrated into POCUS scanners on a widespread basis over the coming years, Adhikari noted.

"The key is [the algorithms] need to be validated in large populations, and they would have to be integrated into the workflow," he said. "The impact on diagnosis, efficiency, or patient outcomes remains to be seen."

Ultrasound as a tricorder?

AI could someday enable POCUS to function essentially as the tricorder in the science fiction series "Star Trek," serving as an ultimate diagnostic tool without the need for an image display, according to Dr. Michael Blaivas, a professor of medicine at the University of South Carolina.

If a diagnosis could be made simply by waving a transducer or sensing device over the patient, anybody could do it.

"It doesn't have to be Dr. McCoy; it could perhaps be somebody else with minimal technical training," Blaivas said.

Several companies are already using AI algorithms in their echocardiography systems, he said. A deep-learning algorithm has also been developed to segment and then calculate intima-media thickness (IMT) in carotid arteries. There's also been work to segment nerves on ultrasound images.

"Wouldn't it be convenient if the nerve was outlined by your ultrasound machine, and you could be told exactly where to stick the needle and inject anesthetic?" Blaivas said. "That's actually already being done. There are several [convolutional neural networks] that have been developed and used in various software."

Automated analysis of the inferior vena cava (IVC) is also available. In a presentation at Innovation in Medicine and Healthcare 2014, researchers reported that such analysis of the IVC performed well, moving the field toward automated measurement, according to Blaivas.
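The presentation's method isn't detailed here, but the quantity such a system would ultimately report is a standard bedside measurement: the caval (collapsibility) index, derived from the maximum and minimum IVC diameters over a respiratory cycle. A minimal sketch, assuming an upstream tracker has already extracted per-frame diameters:

```python
def caval_index(diameters_mm: list[float]) -> float:
    """IVC collapsibility (caval) index from per-frame diameters
    tracked across one respiratory cycle.

    caval index = (max diameter - min diameter) / max diameter.
    A high index indicates a collapsing IVC, which can suggest
    volume depletion.
    """
    d_max, d_min = max(diameters_mm), min(diameters_mm)
    if d_max <= 0:
        raise ValueError("IVC not detected")
    return (d_max - d_min) / d_max
```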

AI can also help count B-lines on lung ultrasound images. In a 2013 study, researchers created and tested an algorithm for automated B-line scoring. In a comparison with two expert reviewers, the algorithm matched the average expert score in 90% of cases. This capability would be particularly important for less-experienced users, Blaivas said.
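The 2013 study's algorithm isn't spelled out here, but the intuition behind B-line counting is simple: B-lines show up as bright, vertical, comet-tail artifacts extending from the pleural line toward the bottom of the image. The toy heuristic below counts contiguous runs of bright columns beneath the pleura; it illustrates the idea only and is not the published method.

```python
import numpy as np

def count_b_lines(frame: np.ndarray, pleura_row: int,
                  brightness_thresh: float = 0.6) -> int:
    """Count B-line candidates in one grayscale lung ultrasound
    frame scaled to [0, 1].

    B-lines run vertically from the pleural line to the bottom of
    the image, so we average brightness in each column below the
    pleura and count contiguous runs of bright columns.
    """
    column_brightness = frame[pleura_row:, :].mean(axis=0)
    bright = column_brightness > brightness_thresh
    # Each dark-to-bright transition starts a new candidate run.
    runs = np.count_nonzero(np.diff(bright.astype(int)) == 1)
    return int(runs + (1 if bright[0] else 0))
```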

In this way, AI could enable ultrasound to be performed by personnel who aren't experts -- perhaps even by people without healthcare training. For example, ultrasound technology developer Butterfly Network recently introduced a personal ultrasound device that allows novices to perform ultrasound examinations with the assistance of AI.

"[A novice] can get reasonable-quality images that could be diagnostic," he said. "The beauty is, you can also have an AI assessment telling you what the ejection fraction is, let's say ... [on this device, AI] not only tells you where to align the beam, where to put the transducer, how to adjust and troubleshoot your image, but also then finally gives you an assessment of ejection fraction."

Assessing the sickest patients

With these advances in lung, IVC, and cardiac imaging, AI-powered ultrasound can now support a complete assessment of the sickest patients, Blaivas said. This could be particularly valuable given the long patient wait times often experienced in emergency departments. During especially busy periods, patients can even die in waiting rooms without being examined.

"Who dies in waiting rooms in the emergency department? People with respiratory issues, cardiac issues, blood loss, let's say, in the belly," he said. "All of the AI algorithms we've talked about so far are perfect for evaluating somebody's respiratory status, cardiac [status], and obviously also for the possibility of blood loss. So this is a great chance to put a suite [of AI algorithms] together for anybody to use."

These clinical AI algorithms could be used anywhere an expert imaging diagnosis would be helpful but is unlikely to be available, according to Blaivas. This might include rapid response teams, as well as nurses or other staff in the emergency department lobby who could screen the sickest patients to rule out a pneumothorax or myocardial infarction, for example.

In the future, physicians may not need to be involved in the ultrasound study at all, Blaivas said.

"As we drive this technology to become more automated, perhaps experts on ultrasound and point-of-care ultrasound aren't really relevant anymore because anybody can do ultrasound," he said. "That sounds a little scary and a little bit silly, but there are probably a whole group of people that made their living some time ago looking at blood smears, analyzing different blood tests that are now done by a machine -- perhaps in minutes."

It will take a long time, though, for this future vision to come into practice, he noted.

At-home patient monitoring represents another big opportunity for AI-powered ultrasound, but it will likely require the addition of 3D/4D cardiac imaging capabilities to these systems, he said. In the future, a parent could also perform an AI-directed ultrasound on a sick child at home, enabling faster diagnosis of pneumonia.

"One thing I've seen this flu season more than any other time is that waiting an extra 12 hours to diagnose somebody's pneumonia can sometimes mean the difference between life and death," Blaivas said. "So earlier diagnosis can be a very good thing."

A pain point

With deep learning and AI, the clinical skill required to make diagnoses on ultrasound images in the future will be minimal, Blaivas said.

"To a lot of us, this will be a pain point," he said. "Even now, I struggle with the fact that some of the things I used to do are being done by people at lower levels. I think that will get much worse."

Training requirements for interpreting ultrasound images will also decrease, perhaps becoming close to nothing, he said.

"Are we possibly seeing the seeds of our own extinction here? Maybe, or maybe just a change in job classifications," he said. "But change can't be avoided, and in this case, benefits will be seen by our patients to whom we really kind of owe everything."
