Do Medicare quality measures miss the mark in radiology?


Tracking physician performance in an effort to improve healthcare quality and patient outcomes is here to stay. But do Medicare quality measures accurately track radiologist performance? Maybe not, according to a new study in the Journal of the American College of Radiology.

Researchers from the Harvey L. Neiman Health Policy Institute found that radiologists didn't perform as well as other healthcare providers on a set of quality metrics adopted by the U.S. Centers for Medicare and Medicaid Services (CMS) to measure physician performance.

But the discrepancy could be due to the nature of radiologists' work, which might imply that a different type of quality metric is needed, according to Dr. Andrew Rosenkrantz and colleagues (JACR, September 1, 2015).

Move to quality

As part of an effort to replace fee-for-service payment in the Medicare and Medicaid systems, CMS is moving toward tying quality measures to reimbursement. In 2010, the agency established its Physician Compare initiative to track performance across a variety of specialties.

Dr. Andrew Rosenkrantz from the Harvey L. Neiman Health Policy Institute.

The metrics, however, may not accurately capture the quality of radiologists' work, which differs from that of other healthcare providers in that the primary service is a diagnostic report. So it's important to assess whether the performance metrics are appropriate, the authors wrote.

"In the current era of healthcare transparency, patients want to make informed healthcare decisions," Rosenkrantz told AuntMinnie.com. "Objective data regarding physician performance, as made available through public websites, could be an important way patients make those decisions, and given the role of CMS' Physician Compare in influencing decision-making in this manner, we felt it would be important to evaluate the content of this public resource as it relates to radiologists."

Rosenkrantz's group used CMS Physician Compare data for more than 900,000 healthcare providers enrolled in Medicare in early 2015. Of these, approximately 30,600 were radiologists. Provider categories included radiologists, pathologists, primary care doctors, other medical subspecialists, surgeons, all other physicians, nurse practitioners and physician assistants, and all other nonphysicians.

Metrics used to assess physician performance included the following:

  • Acceptance of Medicare-approved reimbursements as payment in full
  • Participation in Medicare's electronic prescribing incentive program
  • Participation in the Physician Quality Reporting System (PQRS)
  • Participation in the electronic health record program
  • Participation in the PQRS Maintenance of Certification (MOC) program (encourages more frequent PQRS participation than required)
  • Participation in the Million Hearts initiative, which seeks to prevent a million heart attacks and strokes by 2017

Rosenkrantz and colleagues found that radiologists performed well on measures that were specialty-specific but fell short on general clinical metrics. They also discovered that all provider groups scored low on some of the Physician Compare measures, which suggests that those measures may be poorly selected or not widely achievable, according to the group.

Physician performance by quality metric
Metric | Radiologists | Nonradiologists
Accepts Medicare-approved reimbursements | 75.8% | 85.0%
Electronic prescribing | 11.2% | 25.1%
Participation in PQRS | 60.5% | 39.4%
Electronic health record | 15.8% | 25.4%
Participation in PQRS MOC program | 4.7% | 0.3%
Million Hearts initiative | 0.007% | 0.041%

"Radiologists actually perform extremely well relative to other providers when evaluated by specialty-specific measures, but poorly on those that are not relevant to their specialty," the authors wrote. "This finding suggests that metrics targeted to radiologists, and potentially to diagnostic information specialties in general, may be more appropriate than generic ones (such as electronic prescribing) that apply to provider groups whose work more often focuses on treatment."

Meaningful evaluation

In general, are these measures the best ones to use? Not really, according to Rosenkrantz's team.

"The six currently reported metrics are essentially process measures that do not directly track clinical outcomes ... and fail to fundamentally address the delivery of clinical care," the group wrote. "Aside from the acceptance of Medicare assignment, most categories had only a minority of providers satisfying the metric."

To meaningfully evaluate radiologists, imaging-specific metrics are needed, Rosenkrantz said. After all, flawed metrics mislead both patients and policymakers by fostering the misperception that the measures accurately assess clinical value.

In contrast, specialty-specific metrics not only allow radiologists to fare better in evaluations but also produce data that let patients make more nuanced and informed comparisons among individual providers.

"We need to develop metrics that directly and uniquely reflect radiologists' actual work and contributions to patient care -- like exam appropriateness, radiation dose, turnaround time, and perhaps report content," Rosenkrantz said.
