Decision-support tool slashes inappropriate heart imaging

The use of physician decision-support software based on appropriateness criteria eliminated three-fourths of inappropriate imaging exams for evaluating coronary artery disease (CAD), according to a study published online May 21 in the Journal of the American College of Cardiology.

In a prospective, multicenter study conducted over eight months, the use of a decision-support tool at the time of physician ordering led to a more than 75% reduction in the number of inappropriately ordered imaging studies and a greater than 20% increase in appropriate imaging studies. Increases in medical therapy were also associated with decreases in inappropriate imaging tests.

"The possibility here is that this immediate feedback through the physician-support tool, at a time when a different test can be ordered, can serve as a very useful adjunct or replace prior authorization procedures in a manner that is physician-preferred rather than policy-based," senior author Dr. James Min, from Cedars-Sinai Heart Institute, told AuntMinnie.com.

Prior authorization

With insurance companies increasingly requiring prior authorization by radiology benefits managers (RBMs) for advanced cardiovascular imaging tests, the researchers wanted to identify a method that would encourage the use of appropriate testing in a manner that physicians preferred, Min said.

Specifically for this research project, software developer MDDX built a Web-based decision-support tool to give referring physicians immediate feedback at the point of order on the appropriateness of the imaging study being requested. The automated software was based on the American College of Cardiology (ACC) Appropriate Use Criteria for myocardial perfusion scintigraphy, stress echocardiography, and coronary CT angiography (CCTA).

All appropriate-use criteria indications for CAD evaluation were included using a tree-and-node algorithm that assigned a level of appropriateness to CAD testing based on modality type. The algorithm also incorporated CAD risk factors, pretest likelihood or risk of CAD, clinical presentation, and the specific clinical scenario, according to the researchers.

If the algorithm rated a test as nonappropriate (uncertain, inappropriate, or not addressed by the appropriate-use criteria), the ordering physician was prompted and given the opportunity to record a reason for disagreeing with the rating.
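As a rough illustration only, the Python sketch below shows how a tree-and-node rating lookup of this kind might be structured, including the prompt for a reason when a test is not rated appropriate. The node layout, the example criteria, and every name in it are simplified assumptions made for this article; they do not represent the MDDX implementation or the actual ACC appropriate-use criteria.

# Illustrative sketch only -- the node structure, example criteria, and
# rating labels below are assumptions, not the MDDX tool or the ACC criteria.
from dataclasses import dataclass, field

@dataclass
class Node:
    """One decision point (e.g., symptom status or pretest likelihood)."""
    question: str = ""
    children: dict = field(default_factory=dict)  # answer -> child Node
    rating: str = ""                              # set only on leaf nodes

def rate_order(root: Node, patient: dict) -> str:
    """Walk the tree using the patient's answers until a leaf rating is reached."""
    node = root
    while not node.rating:
        node = node.children[patient[node.question]]
    return node.rating

# Toy tree for a single modality: symptom status, then pretest likelihood.
mps_tree = Node(question="symptomatic", children={
    True: Node(question="pretest_likelihood", children={
        "intermediate": Node(rating="appropriate"),
        "low": Node(rating="uncertain"),
    }),
    False: Node(rating="inappropriate"),
})

patient = {"symptomatic": True, "pretest_likelihood": "low"}
rating = rate_order(mps_tree, patient)
if rating != "appropriate":
    # In the study workflow, physicians could still order the test but were
    # prompted to record a reason for disagreeing with the rating.
    reason = input(f"Test rated '{rating}'. Reason for proceeding: ")

Consistent with the study design described below, a nonappropriate rating in this sketch does not block the order; it only triggers the prompt for a reason.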

Exempted from RBM requirements

The researchers performed a prospective, multicenter study to collect appropriate-use criteria ratings for consecutive, noninvasive imaging tests for CAD covered by a single payor (United HealthCare) for non-Medicare subjects. Data were collected from three single-specialty cardiology sites in the St. Louis metropolitan area between June 2010 and January 2011 (JACC, May 21, 2013).

During the study, all participating physicians were exempted by United HealthCare from the normal imaging prior authorization requirements from RBMs. Physicians were able to order imaging studies regardless of the level of appropriateness determined by the system.

Imaging orders followed a two-step process during the study: after placing the initial order, physicians had to return to the decision-support tool to report the findings of the test. Completing this final step generated the approval code necessary to submit the study for reimbursement, Min said.
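As a minimal sketch of that two-step flow, the Python example below models an order that only receives an approval code once the findings are reported. The class, method names, and approval-code format are hypothetical stand-ins, not the actual MDDX tool or the payor's approval process.

# Hypothetical sketch of the two-step ordering workflow; names and the
# approval-code format are illustrative assumptions only.
class ImagingOrder:
    def __init__(self, patient_id: str, modality: str, rating: str):
        self.patient_id = patient_id
        self.modality = modality
        self.rating = rating        # appropriateness rating at order time
        self.findings = None        # filled in at step 2
        self.approval_code = None   # needed to submit for reimbursement

    def report_findings(self, findings: str) -> str:
        """Step 2: the physician returns to record the test results,
        which completes the approval code for reimbursement."""
        self.findings = findings
        self.approval_code = f"{self.patient_id}-{self.modality}-APPROVED"
        return self.approval_code

order = ImagingOrder("pt-001", "MPS", rating="appropriate")   # step 1: order placed
code = order.report_findings("no reversible ischemia")        # step 2: findings reported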

In all, 100 physicians at the three cardiology practices participated in the study, and the system was used to order imaging studies for 472 patients. Of the imaging studies ordered, 72% were myocardial perfusion scintigraphy, 24% were stress echocardiography, and 5% were CCTA.

Impact of decision-support software on physician ordering
                First 2 months of study    Final 2 months of study    p-value
Appropriate               49%                        61%                0.02
Inappropriate             22%                         6%               0.0001

The rate at which imaging results led to changes in intended medical therapy also increased, from 11% to 32%.

"[These results] contrast with prior studies that looked at single modalities and tried to identify methods by which they could improve appropriate use of testing," he said. "Those methods have included medical grand rounds or postcards or emails, and have almost been uniformly unsuccessful at altering physician behavior as it relates to appropriate-use criteria."

In other findings, it took an average of approximately 137 seconds for the system to provide users with results.

"It didn't take very long [for the physicians] to perform, and over the course of the study, they became more familiar with the tool. At the conclusion of the study, the average time to complete the appropriateness-use decision-support tool was less than a minute," he said.

The researchers are now considering whether to pursue a larger-scale study "that would be sufficiently convincing, so that more payors can take up this approach to encourage physician-preferred imaging decision support," Min said.
