Checklist structured reporting macros don't benefit radiology residents

Sunday, November 25 | 1:00 p.m.-1:30 p.m. | LL-INS-SU2B | Lakeside Learning Center
When checklists are used in medicine, they tend to help caregivers and increase patient safety. But their use as a reporting tool failed dismally in two radiology residency programs in New York City.

A team led by Dr. Daniel Powell, a fourth-year radiology resident at Beth Israel Medical Center, developed a structured checklist macro based on an RSNA template for a maxillofacial CT report. The objective was to see if the checklist could help decrease the rate of undetected pathology.

The research team quantified the pathology missed by residents in two different residency programs for one year before the checklist was implemented and for six months afterward. Use of the checklist was mixed, and it did not reduce the rate of undetected pathology, as Powell will explain in this poster presentation.

With one group, there was a statistically significant increase in undetected pathology, from 16% before the checklist was implemented to 27% afterward. With the other group, there was a less pronounced increase: 17% before versus 21% after the checklist. One program required use of the checklist and had a compliance rate of 85%. Compliance was 3% at the other program, where use was not required.

Half of the residents across the two programs reported that the macro hindered their search, and 61% found the checklist difficult to use.

Would a checklist be more useful if residents were formally trained to use it and were given a practice session? Would experienced radiologists find a checklist beneficial? Attend this presentation to learn what the study's authors plan to do next.
