ABR and RRC activities: radiology's regulators talk

Training radiologists -- a job that includes accrediting residency programs, designing subspecialties, testing competencies, and licensing practitioners -- isn't for slackers. It requires solid decision-making in matters as fluid as clinical practice and staffing levels. It demands diplomacy, a delicate balancing of turf, money, and egos. And its decisions must ultimately serve the cause of patient care.

Fortunately, those who till the soil of radiology's future seem to take the job seriously. At the 2001 RSNA meeting in Chicago, the leaders of diagnostic radiology's Residency Review Committee and American Board of Radiology held a wide-ranging discussion that emphasized the need to stay in touch with practitioners.

"When you look at the things we do, just remember we're the same people as you are," said Dr. Ronald Zagoria. "I'm a practicing radiologist at Wake Forest University, I'm busy, I go to work every day, I read films, I see patients, I train residents, so I know what everybody's going through."

Zagoria is also chairman of diagnostic radiology's Residency Review Committee (RRC), a body with final decision-making authority in the workings of some 204 diagnostic radiology residency programs in the U.S. Under the auspices of the Accreditation Council for Graduate Medical Education (www.acgme.org), the RRC's nine radiologists and one resident meet twice a year to make the accreditation decisions that shape diagnostic radiology programs.

Presenting a categorical course at the RSNA meeting, and in a later interview with AuntMinnie.com, Zagoria discussed the RRC's responsibilities, and some of its recent accomplishments.

"We approve meritorious requests for increases or decreases in the number of residents a program has, we evaluate requirements for different programs like diagnostic radiology and abdominal imaging, we respond to requests from the field for new subspecialties, and we also look at other specialties like surgery and medicine and their requirements," he said.

Over the past year the committee has also spent considerable time reviewing accreditations for radiology training programs. The concerns have often focused on growing staffing shortages, which have made it difficult for some programs to retain the required 1:1 faculty-to-resident ratio.

"Right now directors are shorthanded, and they're afraid they don't have enough (faculty) to justify the number of residents that they have, and when they come up for review we're going to say you have to have fewer residents," Zagoria said. "It adds to our concerns that this will add to the shortage of radiologists. So we want to be very careful about that."

But while acknowledging the concerns of directors, the RRC voted to maintain the 1:1 ratio. The RRC can work with temporary faculty shortages in some cases, Zagoria said, as long as the evidence indicates that residents are doing well.

"Our job is to ensure that these programs are of educational value, they're not just programs where you can go and work for four years with no supervision and then say you're finished with an accredited program," Zagoria said. Most programs meet the 50% first-time test-taker pass rate, he said. For those that don't, the high failure rate often indicates systemic problems.

"You may be able to gloss over a lot of problems with your paperwork, but this is something you can't gloss over," he said. "If people aren't passing the boards, something is wrong with their educational environment, I think, particularly if it's a chronic problem."

The RRC has also reviewed proposals for several new subspecialties, with significant progress in a new program for cardiothoracic radiology. The board has evaluated the proposed program and allowed the group to proceed through the ACGME. In a year or two, cardiothoracic radiology could be a new accredited subspecialty, he said.

"Breast imaging, they're not quite as far along, but they're proposing a program that includes mammography, ultrasound, MRI, nuclear medicine, and interventional radiology of the breast," Zagoria said. "We evaluated the concept, what kind of impact it would have, the general guidelines. We thought it looked fine, no problems, and they'll make a formal proposal to us whenever they want to."

The board is also considering a new two-year neuroradiology program that could be offered along with the current one-year option. If accepted, any institution would be able to offer one or both programs, he said.

Why bother with accreditation? Accreditation in diagnostic radiology lends a certain stamp of quality or approval to a program, as well as the possibility of reimbursement by Medicare or the hospital, Zagoria said.

"If they are accredited they have to be teaching programs, versus if they're unaccredited you can do whatever you want with the fellows -- you can have them working, they can be generating income. Whereas with the accredited programs you can't really do that; they have to be trainees, they have to be supervised."

Some programs that include a clinical year hope to be accredited under the radiology banner in order to facilitate funding by the Centers for Medicare and Medicaid Services (CMS), the U.S. agency that administers Medicare. The idea seems reasonable, he said, so some programs are going to be accredited for five full years instead of four, as they are now.

Looking ahead, the American Board of Medical Specialties (ABMS), which oversees the American Board of Radiology (ABR), has recently added six general competency requirements to each of the 26 specialties it oversees, including radiology. According to the ABMS, the specialties must already have begun to incorporate "relevant and germane" aspects of the following competencies into all of their residency programs:

  • Medical knowledge.

  • Patient care.

  • Interpersonal and communication skills.

  • Professionalism.

  • Practice-based learning and improvement.

  • Systems-based practice.

The specialties are crafting their particular versions of competency testing through quadrads, or groups of four representatives from each specialty, which together form the Quadrad Commission. The diagnostic radiology quadrad, led by ABR executive director Paul Capp, has been working since early 2001 to customize the general competencies for radiology. The competencies will be both tested and scored by the ABR, Zagoria said.

"The general competencies are a big concern for program directors, but I assure you, this is going to work out just fine. We're not going to be overly concerned if you say you're instituting a program, but it's not (yet) fully developed." And it will be a long time, he said, before any residents fail their accreditation based on not having met general competency requirements.

The ACGME recently instituted the Parker J. Palmer Award to honor 10 outstanding program directors each year. More information is available at the ACGME's Web site.

The American Board of Radiology

The ABR's mission statement says it all, according to ABR president Robert Hattery, who drove home the point during his presentation by reading the text aloud.

"The mission of the American Board of Radiology is to serve the public and the medical profession by certifying that its diplomates have acquired, demonstrated, and maintained a requisite standard of knowledge, skill, and understanding essential to the practice of radiology, radiation oncology, and medical physics."

To prepare for the huge task ahead, the ABR has leased a 30,000-square-foot building in Tucson, AZ, which opened last October. The organization will use about half the building and lease out the rest. Most of the occupied space is dedicated to the computer-based testing center, the main headquarters for a computerized testing program that is constantly evolving as the ABR consults with radiology organizations and other medical specialty boards, Hattery said.

"As we become more computer-centered, we have to have psychometric data that underpins the validity of what we do," Hattery said. "We're working hard on psychometric analysis of the oral exam. As you know, there are a lot of psychometrics for the written exam, but they are changing as we put it together."

Meanwhile, the staff has expanded to administer the oral and written testing, which will eventually be given in satellite centers throughout the U.S. as well as in Tucson. The testing process is used to evaluate:

  • Professional standing.

  • Evidence of commitment to life-long learning.

  • Involvement and self-assessment.

  • Evidence of cognitive expertise.

  • Evidence of evaluation of performance in practice.

"The last one is difficult to assess, to have a metric implementation," Hattery said. There is a way to do it, he said, but none of the medical specialties are ready to quantitatively evaluate clinical performance just yet.

The new general competencies tests present similar difficulties, he said. The goal is to someday be able to not only measure them, but to accurately assess improvement over years of practice.

ABR executive director Paul Capp used his RSNA talk to review a study assessing trends in residents' performance on the exams over the past 10 years. Overall, residents are taking the tests earlier, he said, with the great majority opting to take the physics exam in the second and third years. Most are waiting to take the clinical exam by itself in the fourth year.

Since 1998, administrators have seen a high failure rate for the physics portion of the exam, but in 2001 the situation improved significantly, he said. The oral exam pass rate hasn't changed over the last 10 years, nor has the failure rate for first-time takers.

"The quality of performance decreases when residents take both (the written and oral) exams during the fourth year, most likely the result of marginal candidates being uncertain of their performance and waiting to take both exams during the fourth year. If a candidate has passed the written by the third or fourth year, there is no measurable difference in oral examination performance. So whether a candidate passes the written exam in clinical or passes the single exam in the fourth year makes no difference with regard to their oral performance."

Between 1985 and 1990, 95% of residents passed their examinations; in 2001, 92% passed. "So we think the exam process is on solid ground," Capp said.

Regarding the fourth year of residency, an audience member called it a well-known waste of time. Very little training occurs in the fourth year, he said, because residents are uninterested in learning anything but how to pass the oral board exams -- which could be delayed until after a year of fellowship.

Capp responded that in his 20-plus years on the board, perhaps no other question has generated more discussion and calls for change. Yet there are pluses and minuses on both sides of the equation, he said.

"On the one hand, those of us who run training programs know that the fourth year is a complicated year, with the residents perhaps not learning to be radiologists as much as hitting the books and looking at cases and trying to prepare for the oral exam," he said.

"On the other hand, the subspecialty programs are vehemently against delaying the oral examination, because if you take (for example a fellowship) in neuroradiology, at the end of the year you're going to be examined in mammography (or) in musculoskeletal disease. It would be a worthless fellowship because those fellows are going to be doing the same thing: trying to prepare for the examination and not doing their fellowship."

While the ABR has done plenty of tinkering over the years, Capp said, sometimes the best decision is to leave things the way they are.

By Eric Barnes
AuntMinnie.com staff writer
May 17, 2002

Related Reading

ARRS president urges rads to take stock in the future, April 29, 2002

Life after residency: From protocols to physicals to malpractice, June 15, 2001

Life after residency: Making the transition to practice, June 14, 2001

Radiology residents struggle to balance career and family, April 11, 2001

Copyright © 2002 AuntMinnie.com
