By Kate Madden Yee, AuntMinnie.com staff writer

October 1, 2013 -- Referring physicians asked to rate the usefulness of a set of sample radiology reports gave them relatively high marks, according to a new study published in the October American Journal of Roentgenology. But the referring physicians added that they would like to hear from radiologists more often about important findings.

Radiologists' reports are the primary way they communicate findings to both referring physicians and patients, and radiologists have worked hard to improve their reporting practices, particularly through peer review, wrote lead author Dr. Andrew Gunn and colleagues from Massachusetts General Hospital.

The most common type of radiology peer review is to assess how well two radiologists blinded to each other's work agree on a case -- whether an abnormality was perceived, interpreted, and reported correctly.

Dr. Andrew Gunn from Massachusetts General Hospital.

But there's still room for improvement in radiology reports, especially because radiologists aren't the end users of the finished product: referring physicians are. A formal peer-review process that included structured feedback from these doctors could improve reporting practices even more, Gunn's team hypothesized (AJR, October 2013, Vol. 201:4, pp. 853-857).

"Every radiologist has received a phone call or an email from a referring physician saying there's something wrong with the report, like a typo, or even the clinical question wasn't answered," Gunn told AuntMinnie.com. "We give each other radiologist-to-radiologist feedback, but what's missing is input from referring physicians and patients."

For the study, the researchers invited five referring physicians -- all primary care providers -- to participate as reviewers. Reports from abdominal CT, chest CT, brain MRI, and abdominal ultrasound exams acquired in September 2010 were considered appropriate for review if the indication was "abdominal pain," "shortness of breath," "headache," or "pain," respectively. Reports from normal or follow-up exams were excluded.

Gunn and colleagues then selected at random 12 reports from each of these categories (for a total of 48) and distributed them to the five reviewing physicians, along with the clinical scenario and an evaluation form for each exam.

On a scale of 1 to 5 (1 being "not useful" and 5 being "very useful"), the reports were found to be clinically useful (average, 3.8), allowing for good confidence in clinical decision-making (average, 3.7). The referring physicians cited the following as the most common problems:

  • Unclear language (15%)
  • Typographical errors (12.1%)
  • Not answering the clinical question (7.9%)

Of the reports, 35.4% contained recommendations for further diagnosis or treatment, and 84.7% of these recommendations were deemed clinically appropriate by the referring physicians. Of the 64.6% of reports that did not have recommendations for further diagnosis or treatment, the physicians felt that 31% should have included a recommendation, the authors wrote. In addition, nearly a third (31.2%) of reports that did not make clear whether the results had been directly communicated to the ordering providers contained results that should have been communicated, according to the reviewers.

"A key piece of feedback we got from these referring physicians is that they would have liked to be notified about results more often," Gunn told AuntMinnie.com. "The comment was something like, 'You wouldn't believe the nonsense I get paged about, but important findings don't come through.' So that told us that radiologists should be calling and paging more often. It's good for patients, and it shows referring doctors that someone's reading the scan, someone is paying attention."

The detection and interpretation of abnormalities is the responsibility of the radiologist, but good reporting is a multifaceted process that includes not only documentation of abnormal findings, but also putting these findings in context so that the referring physician can effectively treat the patient. But what if there are terms or practices in radiology that referring physicians aren't familiar with?

"Such items are unlikely to be identified during a radiologist-to-radiologist peer-review process even though they may cause significant angst among referring physicians," Gunn and colleagues wrote.

Including feedback from referring physicians in the process could improve reporting practices in radiology and, at the same time, improve communication and camaraderie between radiologists and referring physicians, according to the authors.

How could this be accomplished? One idea is to develop a Web-based evaluation that the referring physician could choose to fill out at the time of reading the report on the electronic medical record, Gunn's group suggested. Of course, it's unclear whether referring physicians would actually take the time to provide structured feedback on a recurring basis if given the opportunity, and the system would need to be designed so that reports would only be evaluated if they contained errors.

But such a system would have the benefits of anonymization, random selection of reports for review, reduced bias in the selection of reviewers, and physician-specific data that could be collected continuously or on a periodic basis, Gunn and colleagues wrote.

"The primary care doctors appreciated that we reached out to them, that we're aware there could be communication issues in radiology reports, and that we want to improve the service we provide," Gunn said.


Copyright © 2013 AuntMinnie.com

Last Updated 10/1/2013 5:10:26 PM

20 comments so far ...
10/1/2013 9:37:06 AM
bigskyrad
It is sad that a study from a respected academic institution on the clinical utility of radiology reports would ignore a basic fact that communication works best when at least two individuals are actually involved. The proper two-way communication in the situation studied is supposed to start with the x-ray order. Referring physicians are often grotesquely negligent in their responsibility to communicate relevant clinical information on the order. A "history" of "abdominal pain" is what I would expect from my house cleaner. A proper history on an x-ray order of, say, "subacute RUQ pain and tenderness, fever, leukocytosis" would much more likely result in an ultrasound or CT report that emphasized clinically relevant findings. A single word "history" of "dyspnea" on the order for chest radiography may result in a normal report, while a clinical indication for that same study of "slowly progressive dyspnea - hx of scleroderma" would often result in a carefully worded discussion on the presence or absence of (often subtle) interstitial lung disease and perhaps a recommendation for high resolution pulmonary CT.
 
Medical students should be taught that the x-ray order is a clinical tool that should be used properly. I illustrate the proper use of this clinical tool in my most recent book, "Practical Radiology - A Symptom Based Approach," published earlier this year by F.A. Davis.

10/1/2013 12:45:20 PM
Rolf Rad
I am curious - are you able to get your clinicians to comply? If you do, you live in a different world than I do. 
 
 
I would love it if I could get this, but gave up years ago. 
 
In certain situations, I insist on calling the referring doc and discussing the patient with him or her directly. SO much more information. Of course, some don't return calls, etc. 
 
This makes me very cynical about the quality of care that we give. I try, but it seems that the patients' doctors don't think it important enough to talk to me. 

10/1/2013 1:00:18 PM
expertrad
I definitely agree with bigskyrad.
Somehow this service industry mentality is so pervasive.
It hinders our specialty from standing on its feet and demanding certain prerequisites in the clinical order.
I think there is an idea there for any research-minded person to look at the improvement in quality with some intervention.
  

10/1/2013 2:05:30 PM
IGotKids2Feed
Quote from expertrad


I think there is an idea there for any research minded person to look at the improvement in quality with some intervention.



I'll take it one step further and throw out a rad research idea (you're welcome):

Develop software that automatically mines the chart data and enters the key clinical information for the particular rad study being ordered.
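As a rough illustration of that idea (every name, study type, and term list below is hypothetical -- a real system would query the EMR's API and use proper clinical NLP rather than keyword matching), a first pass could be as simple as scanning recent chart notes for terms relevant to the ordered study and prefilling the order's indication field:

```python
import re

# Hypothetical mapping of ordered study -> chart terms worth surfacing.
# These lists are illustrative only, not a clinical reference.
RELEVANT_TERMS = {
    "abdominal ultrasound": ["ruq pain", "fever", "leukocytosis", "jaundice"],
    "chest radiography": ["dyspnea", "scleroderma", "cough", "hemoptysis"],
}

def prefill_indication(study: str, chart_notes: str) -> str:
    """Return a draft clinical indication built from chart-note terms."""
    found = [term for term in RELEVANT_TERMS.get(study, [])
             if re.search(re.escape(term), chart_notes, re.IGNORECASE)]
    return "; ".join(found) if found else "indication not found - review chart"

notes = "Slowly progressive dyspnea. PMH: scleroderma, GERD."
print(prefill_indication("chest radiography", notes))
# prints: dyspnea; scleroderma
```

Even this crude draft turns bigskyrad's "dyspnea" order into "dyspnea; scleroderma" -- closer to the history a radiologist actually needs, while still letting the ordering physician edit before signing.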

10/1/2013 2:49:22 PM
kcrad
Unfortunately, the biggest nuggets of data in the chart are often in the radiology report.