Local or regional clinical AI test beds may emerge in U.S.

By Liz Carey, Feature Writer

AI is needed to support coordinated care pathways and precision medicine, and local or regional AI test beds are emerging, according to testimony at a March 3 U.S. Senate committee hearing on AI safety, productivity, and care.

Rad AI Chief Innovation Officer Demetri Giannikopoulos, an expert in AI governance, served as the primary healthcare witness. Through the lens of radiology, Giannikopoulos testified that access to validation data, outcomes-linked data, and a larger volume of radiology reports could significantly advance AI in the U.S.

"Extensive research has shown that errors often occur because clinicians are operating within an increasingly complex and strained healthcare system, managing rising volumes, time pressures, and expanding information," said Giannikopoulos, who is not only a healthcare AI deployment expert but also the spouse of a nurse practitioner and himself a patient living with multiple sclerosis.

The hearing comes as the radiology business, professional, and scientific community grapples with AI implementation decisions amid workforce pressures and the risk of diagnostic error. 

"Outcomes often hinge on whether a diagnosis is made quickly and accurately, because in healthcare, the most dangerous failure is not a machine failure, it is a missed or delayed diagnosis," Giannikopoulos said. "If you look at precision medicine pathways, AI has already opened those up in ways that we've never seen ... but with better integration, better understanding, better tailoring of the medicine and the understanding to the individual, it will be able to take that to the next level."

Trust factor

The key piece that needs to be built is trust, according to Giannikopoulos, whose previous work at AI software developer Aidoc involved developing clinical, operational, and governance protocols for integrating multiple U.S. Food and Drug Administration (FDA)-cleared AI medical devices into physician workflows across health systems nationwide. 

"Access to data is a critical aspect of development for artificial intelligence," he noted in a response to questions from hearing attendees. "However, in healthcare, the ability to have validation data where you can measure the quality of your solutions that are developed, by which deployers at the institutional level can assess the fit within their personal institution, is one of the greater challenges. Having more robust access to this machine-readable and translatable data will provide opportunities to actually assess these solutions not in a vacuum, but against real Americans."

The hearing was convened by the Senate Committee on Commerce, Science, and Transportation’s Subcommittee on Science, Manufacturing, and Competitiveness, and also featured Siemens Digital Industries Software Vice President Brittany Ng; Damion Shelton, PhD, cofounder and chairman of Agility Robotics; and Mark Muro, a policy expert from the Brookings Institution. 

"Accelerating AI adoption requires promoting the growth of emerging AI clusters in geographic regions," Muro said. "Bottom-up economic innovation in regions is an important part of the work ahead of us."

Virtually all workers will need to understand AI principles, be able to understand and direct AI effectively, and be able to evaluate its outputs, Muro added.

Radiologist tasked

For radiology practices, clinical AI adoption is rife with thorny issues; one is that evaluating radiology AI outputs falls on radiologists. The U.S. Department of Health and Human Services (HHS) appears to support clinical AI adoption and use but has acknowledged that public trust and confidence, policy clarity, and federal incentives may be lacking.

In response to an HHS request for information, the American College of Radiology (ACR) outlined key actions for the HHS' plan to shape the path forward. They include research investments and regulatory oversight of AI-enabled medical devices and nondevices.

The HHS is exploring cooperative research and development agreements (CRADAs), a type of research and development public-private partnership to integrate AI in care delivery and create long-term market opportunities.

ACR Executive Director Dana Smetherman, MD, sent an eight-page letter on February 19 highlighting that, under current rules, AI tools are typically treated as practice expense inputs or bundled into existing Current Procedural Terminology (CPT)/Healthcare Common Procedure Coding System (HCPCS) codes, regardless of whether they materially enhance diagnostic accuracy, reduce variation, or support improved clinical outcomes.

"This approach effectively treats AI the same as commodity software, even when the technology performs clinically meaningful tasks and requires physicians, particularly radiologists, to invest additional time to interpret, validate, and incorporate AI outputs into clinical decision-making," she wrote.

Reimbursement puzzle

At RSNA 2025, diagnostic radiologist Eric Rubin, MD, said many AI tools lack distinct billing and payment pathways. The Centers for Medicare and Medicaid Services (CMS) sets the tone, he said.

"Category 1 CPT codes are viewed by many as the optimal pathway for payment for AI services," explained Rubin, a CPT code advisor to the ACR and member of the ACR Economics Commission. "We're just now at a place where we're seeing close to universal payment for the procedure [fractional flow reserve, CPT code 75580]," for example. 

Fractional flow reserve carries a very high technical component payment but only a very small professional component payment, which is the primary concern with many AI-based codes, where physicians must analyze the AI output and then put it into context relative to the rest of the findings. Most of the money goes to the technical component based upon the Relative Value Scale Update Committee valuation process, he added.

AI services can also be paid under a new technology ambulatory payment classification (APC). Several radiology AI services are already paid under new technology APCs, including plaque analysis, lung cancer prediction, and quantitative liver analysis, according to Rubin.

"There is a potential distinct separation of the technical work and the professional work here," Rubin explained at RSNA. "The radiologist may need to manage the outputs of multiple algorithms simultaneously for the same study. In fact, some of the work that a physician is doing with one software output may overlap with the work related to another software output."

Opportunities

The bottom line is that it takes the work of radiologists as physicians to implement AI into radiology workflows. 

"The radiology economics team is very strong and constantly thinking about this," Rubin said, adding that the committee has begun to explore a new class of CPT codes that would describe the technical work of AI software algorithms and apply only when there is no evidence of traditional physician work.

The next possibility would be the opportunity to create dedicated physician work codes that encompass the work of radiologists to manage the AI outputs, Rubin continued, noting evaluation and management (E&M) codes, in particular. 

Generally, E&M codes are add-on HCPCS codes that capture the complexity both of patients' visits and of ongoing services for medical care. Radiologists currently use E&M codes less frequently than primary care providers.

Feedback to FDA

Tuesday's U.S. Senate committee hearing follows stakeholder feedback submitted to the FDA in December 2025 on measuring and evaluating AI-enabled medical device performance in the real world, another thorny issue in implementing clinical AI.

Given the spectrum of AI medical devices, including software as a medical device (SaMD), software in a medical device (SiMD), AI/machine learning-enabled devices, and computer-aided devices (CADt, CADe, and CADx), healthcare data company IQVIA expressed concern about the FDA's ability and willingness to stay abreast of the rapidly changing AI-enabled medical device landscape.

IQVIA's letter to the FDA recommended that the FDA adopt a framework emphasizing "model credibility and holistic performance." Detailed, indication-specific assessments should transition to postmarket surveillance, the company said, adding that the FDA should be given the mandate and adequate resources to review and update existing special controls in order to eliminate unwarranted barriers to the development of AI-enabled medical devices.

Also top of mind is the difficulty of establishing the pipelines needed for ongoing monitoring of AI tools. Data curation for the purpose of monitoring system performance and usage over time poses an especially significant challenge, according to Javin Schefflein, MD, a neuroradiologist; David Arnold, PhD; and Cardinale Smith, MD, PhD, in a letter from Memorial Sloan Kettering Cancer Center (MSKCC).
