February 26, 2021 -- Artificial intelligence (AI) has struggled to gain a foothold in radiology over the past five years or so, achieving fairly limited success. Much of the slow acceptance is related to pushback by radiologists and others to early messaging that AI is or will soon be better than a radiologist and will ultimately take their place.
It should come as no surprise then that radiologists have not given their unbridled support to a technology that was purported to put them out of business. Instead, radiologists mounted a full-on counterattack worldwide in forums, blogs, articles, and on social media against the narrative that AI alone is the future of radiology.
At the same time, radiology has also had to deal with an assault from nurse practitioners wanting a piece of the radiology pie as well. With attacks seemingly coming from every side, it would be an understatement to say radiologists are at DEFCON 1 in defense of their livelihood.
Hundreds of studies have been published that report AI's performance to be equal to or better than radiologists, and even those that are pro-radiologist -- such as this review in European Radiology Experimental -- often contain comments, such as the one quoted below, that reinforce the concerns many radiologists have.
"The key point is that AI has the potential to replace many of the routine detection, characterization and quantification tasks currently performed by radiologists using cognitive ability, as well as to accomplish the integration of data mining of electronic medical records in the process."
Wider acceptance of AI?
Some in the AI community will point to a wider embrace of the technology after the U.S. Centers for Medicare and Medicaid Services (CMS) in 2020 approved a new technology add-on payment (NTAP) for AI software used to detect stroke on CT scans. This program, which lasts a maximum of three years, pays up to $1,040 per use of approved algorithms.
But here's the kicker for the NTAP program. Besides being new technology, "the standard [diagnosis-related group] rate must be deemed insufficient to cover the costs of implementing the service or technology," according to CMS. To me, that indicates CMS had two choices: Tell AI vendors to lower their price or underwrite its use. It seems CMS chose option 2.
The third requirement is that "the service or technology must demonstrate substantial clinical improvement over existing services or technologies." Now I have no doubt whatsoever that AI can be a positive force in assisting with stroke protocols. That said, is running an AI algorithm on the study first a "substantial clinical improvement" over having a radiologist simply read the study more quickly?
If AI reads the study as positive for a stroke, a radiologist still needs to validate the AI's findings ... unless you are going with AI as the sole determinant, which is light years away from happening. That may be the ultimate goal, but it isn't anywhere close to the reality today.
Is that worth up to $1,040 per use? Seconds count in a stroke protocol, so perhaps it is. But protocols that minimize delays in stroke management have already been defined and in use for years.
For example, a paper published in the Journal of Neurointerventional Surgery defines the use of a stroke code team activation (aka a "brain attack response team") to address the patient while they are in the CT scanner and in some cases even before. These teams consist of stroke nursing coordinators, emergency room physicians, stroke neurologists, radiologists, and CT and/or MR technologists.
The paper also defines the use of an emergent large vessel occlusion (ELVO) team as consisting of a neurointerventional surgeon, interventional technologist, nurse, and in some cases, the anesthesiologist. This ELVO team works with the brain attack team to get the patient into the interventional suite quickly and have the procedure started in under one hour.
So with teams in place as the scan is being done, a diagnosis made, and treatment plan initiated, where is the added value of AI? Well, not everyone has stroke code and ELVO teams, so in situations like this -- especially in smaller regional facilities -- there is value to AI.
But sometimes a simple thing like notifying the radiologist in advance that a stroke assessment is coming their way will make a difference. And the AI's findings could be attached to the study to help validate the results. This definitely provides added value, but is it worth the up to $1,040 per study that NTAP pays? That's up for debate.
Now am I saying that AI has limited value and that we should do without it? On the contrary, I actually support and suggest that every radiology study be augmented by AI. This complementary input and/or second opinion can provide radiologists with added value, provided the cost isn't excessive. But defining the cost/value relationship is one of AI's many challenges.
Too much money?
Few will argue that bringing a product like AI to market costs lots of money, often millions of dollars. But how much is too much?
Well, over $1 billion has been invested in AI in medical imaging to date, although in fairness, nearly half of this went to a single AI company in the cardiovascular marketplace. That still leaves a cruise ship's worth of money invested in AI. More than a dozen companies have each received over $25 million in funding, and several are on their second, third, or even fourth round of venture capital (VC) investment, accruing $50 million or more in total.
If the market had a dozen players or fewer, it might still be a slight challenge for vendors to show a return on investment (ROI) in a timely fashion. At last count, though, the medical imaging AI market has over 150 vendors, more than 100 of which have products cleared by the U.S. Food and Drug Administration. All of these firms are vying for the same small market segment. Few are breaking even, let alone making money. The reality is that most vendors will be hard-pressed to stay alive in the long term.
Will AI companies ever make money? Yes, but how much and when is the question, especially given the burn rate and debt load most AI companies are currently carrying. NTAP is already augmenting some payments to end users and stimulating sales to a select few AI companies. But that is only for a limited time. With an ROI model that is built heavily on radiologist time savings and is very light on financial advantages to the institution, AI tends to appeal most to facilities where the radiologists are employees.
VC firms had hoped, and were told, that the market would be developed by now. It is, however, still at least three to five years away from showing a significant ROI. As everywhere else, there are exceptions, but generally speaking, this is the way it is.
Can AI save itself?
AI has a lot to overcome. The first thing it needs is a clear, industry-wide definition of what AI is and isn't. Most, but certainly not all, of what has been called AI to date is closer to computer-aided detection (CAD) than true AI.
True AI, at least from a clinical perspective, ideally includes analysis of the patient's medical data, leading to a preliminary diagnosis. This is crucial given the limited time a radiologist has to review a study.
The article referenced earlier in European Radiology Experimental clearly defines this issue:
"The increasing amount of data to be processed can influence how radiologists interpret images: from inference to merely detection and description. When too much time is taken for image analysis, the time for evaluating clinical and laboratory contexts is squeezed. The radiologist is reduced to being only an image analyst. The clinical interpretation of the findings is left to other physicians. This is dangerous, not only for radiologists but also for patients: non-radiologists can have a full understanding of the clinical situation but do not have the radiological knowledge. In other words, if radiologists do not have the time for clinical judgment, the final meaning of radiological examinations will be left to non-experts in medical imaging ... In this scenario, AI is not a threat to radiology. It is indeed a tremendous opportunity for its improvement."
Rethinking AI's role
Rethinking AI's role is crucial to its success and industry-wide acceptance. Several recent studies have found that AI, and even a few non-AI tools, can identify masses before a radiologist picks them up. That is a plus, but it doesn't make AI better than the radiologist. AI is certainly a complement, as it can discern grayscale levels better than the naked eye can. It is not a replacement, though: interpreting the image is only a small part of the diagnostic process, and clinical data must be assessed as well.
So what will it take for AI companies to hit a home run?
If there is one fundamental thing that AI companies and their PR and marketing firms need to understand, it is that while AI can and does often improve the diagnostic process and workflows, it cannot and should not take away the decision from the radiologist.
Radiologists are the ultimate gatekeepers of patient care from an imaging perspective and bear the responsibility to provide the very best care possible. AI can and should be an adjunct to that care. The industry must recognize and position AI's role as a complementary -- not competitive -- tool for radiologists.
Michael J. Cannavo is known industry-wide as the PACSman. After several decades as an independent PACS consultant, he worked as both a strategic accounts manager and solutions architect with two major PACS vendors. He has now made it back safely from the dark side and is sharing his observations.
His healthcare consulting services for end users include PACS optimization services, system upgrade and proposal reviews, contract reviews, and other areas. The PACSman is also working with imaging and IT vendors to develop market-focused messaging as well as sales training programs. He can be reached at email@example.com or by phone at 407-359-0191.
The comments and observations expressed are those of the author and do not necessarily reflect the opinions of AuntMinnie.com.
Quote from jordy78
As always, insightful and thought-provoking.
My .04 cents:
- AI=Cheaper and faster prediction machines.
- Clinicians=Those who make judgments on information (including from prediction machines).
- Radiologists who "embrace AI"=Remain the clinicians making the judgments in the future.
- Radiologists who "don't embrace AI"=Will likely not be the clinicians making the judgments in the future.