ARRS: Productivity goes beyond reading volume

VANCOUVER - Radiology practices must track their productivity to remain viable in an increasingly competitive healthcare market. But tracking productivity goes way beyond just following the number of studies your group reads, according to a presentation at this week's American Roentgen Ray Society (ARRS) meeting.

The challenge for radiology groups is to create a larger job description for their members than "film reader" -- and then create a productivity framework that will allow the group to perform at peak levels, said Dr. Frank Lexa, professor of radiology at Drexel University and professor of marketing at the Wharton School.

"Productivity can be one of the most divisive issues in radiology groups," Lexa said. "It's a tough issue to manage, much less manage well. But it's important not to measure productivity solely by cases read. We're physicians and professionals, not assembly line workers."

The baby with the bathwater?

It's easy to confuse production with productivity, Lexa told session attendees. Yet how many studies a practice reads or how many relative value units (RVUs) it earns are only part of its productivity profile: The new value chain for radiology services goes way beyond "just" reading films, according to Lexa.

Radiologists can and do offer their customers -- i.e., referring physicians, technologists, administrators, and patients -- services that reflect their training, such as the following:

  • Consultation for appropriate imaging
  • Equipment and protocol optimization
  • Personalization of imaging
  • Clinical consultation with referring physicians
  • Discussion of exam results with patients

"Focus on accuracy and service, not on who reads the fastest," Lexa told session attendees. "You hear about radiologists posting how many cases they've read that day on their Facebook pages. But we're more than just film readers."

The 'whats' and 'whys' of productivity

A better question than "How many reads can we produce?" is "What are we measuring and why?" Lexa said. Radiology practices should track their productivity because doing so can improve the overall performance of the group, foster equity between individual workloads, and uncover ways to improve performance through better support from the practice's administrative staff, its existing technology, and its operations and workflow systems.

In addition to number of studies read and RVUs generated, practices should measure and manage the following:

  • Report accuracy via peer review
  • Referring physicians' perspectives on the value of the reports and the quality of service they receive
  • Practice efficiency
  • Workflow (volume, intensity, specialty focus)
  • Communication before and after the exam
  • Patient satisfaction
  • Other value group members generate, such as practice-building, leadership, or management
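As a rough illustration of tracking more than case volume, the dimensions above could be combined into a simple weighted scorecard. This is a minimal sketch: all field names, targets, and weights are hypothetical, not a validated instrument or anything Lexa proposed.

```python
from dataclasses import dataclass

@dataclass
class RadiologistScorecard:
    """Hypothetical multi-metric scorecard; every weight below is illustrative."""
    rvus: float                  # RVUs generated over the period
    peer_review_accuracy: float  # 0-1, from peer-review overreads
    referrer_rating: float       # 0-5 survey score from referring physicians
    patient_satisfaction: float  # 0-5 patient survey score
    nonclinical_value: float     # 0-5, practice-building / leadership / management

    def composite_score(self) -> float:
        # Equal-footing normalization against assumed targets; a real
        # practice would negotiate its own weights and benchmarks.
        return (
            0.25 * min(self.rvus / 8000, 1.0)   # vs. an assumed annual RVU target
            + 0.25 * self.peer_review_accuracy
            + 0.20 * self.referrer_rating / 5
            + 0.15 * self.patient_satisfaction / 5
            + 0.15 * self.nonclinical_value / 5
        )
```

The point of such a structure is simply that volume (RVUs) carries only a fraction of the total weight, mirroring the argument that reads and RVUs are "only part of the productivity profile."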

"Assess the quality of your reports in terms of timing, completeness, and how unexpected findings are communicated," Lexa said. "To gauge the accuracy of your reports, use tools like informal peer overreads, morbidity and mortality conferences, and external reviews such as RadPeer."

A radiology group tends to go through certain phases as it develops a productivity management framework, Lexa said. Most practices start with ignorance -- don't tell, don't measure -- before moving on to "peeking" at the problem (measuring haphazardly without managing the data well).

Often, groups move from this stage to anonymous reporting of productivity in the hope that outliers will fall into line; if this doesn't work, some groups try reporting productivity data with individuals' names attached, before finally settling on some form of managing the practice's productivity via a rewards system (e.g., bonuses or vacation).

But even this last approach can backfire with radiologists, who are "knowledge workers" -- professionals whose main capital is knowledge, Lexa said.

"Investigate the literature on what motivates these kinds of workers before you focus on incentives," Lexa said. "Paradoxically, knowledge workers may respond to incentives with worse performance."

Bad apples or cramped apples?

To analyze its productivity, a radiology group needs to regularly review data on what was read and when, not just annual numbers. Why? Because analyzing data more often can reveal operations problems disguised as productivity problems, Lexa said. It's important to find out exactly what is contributing to low performance in a practice: Often it's not that a member is lazy, but that he or she is working in a crowded space, or has a problem with the PACS.

"Address variability issues in detail before assuming that bad behavior is the cause [of low productivity]," he said.

It can also help to use IT tools such as smart lists, Lexa said.

"Create a single list that aggregates all the radiologists' work, prioritizes it, balances it appropriately -- no 'cherry picking' -- and matches expertise and ability to tasks where appropriate," he said.
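The smart list Lexa describes can be sketched as a shared priority queue with load-balanced, expertise-matched assignment. This is an assumed toy model, not any particular PACS or worklist product; the data shapes and the tie-breaking rule are invented for illustration.

```python
import heapq
from collections import defaultdict

def assign_studies(studies, radiologists):
    """Hypothetical 'smart list' assignment.

    studies:      iterable of (priority, study_id, subspecialty) tuples,
                  lower priority value = more urgent
    radiologists: dict mapping reader name -> set of subspecialties covered
    """
    queue = list(studies)
    heapq.heapify(queue)      # one aggregated, prioritized list -- no cherry picking
    load = defaultdict(int)   # running count of studies assigned per reader
    assignments = []
    while queue:
        priority, study_id, subspecialty = heapq.heappop(queue)
        # Match expertise to the task where possible...
        qualified = [r for r, skills in radiologists.items() if subspecialty in skills]
        candidates = qualified or list(radiologists)
        # ...then balance the work by giving it to the least-loaded reader.
        reader = min(candidates, key=lambda r: load[r])
        load[reader] += 1
        assignments.append((study_id, reader))
    return assignments
```

Pulling every study through one queue enforces the priority order practice-wide, while the load counter keeps individual worklists roughly even.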

Widening the scope

The bottom line is that in addition to tracking the number of reads, radiology practices need to set base expectations and group norms using clear management techniques, according to Lexa. They should help staff master any tasks involving technology that influence workflow, analyze underlying reasons for variability between individuals, and include a wide range of contributions in the productivity list.

"Practices need to ask themselves whether they're measuring what's really important," he said.
