Has standardized testing ruined radiology's board exam?


The radiology board certification exam's shift from an oral exam to a standardized, multiple-choice test may have made the exam less accurate at evaluating which candidates are ready and competent to become radiologists, according to a new article in the Journal of the American College of Radiology.

Why? Because the more standardized version of the exam adopted by the American Board of Radiology (ABR) in 2008 relies on psychometric testing, a multiple-choice technique that gauges participants' suitability for a role based on the personality characteristics and aptitude the role requires (JACR, March 31, 2018). This kind of testing doesn't truly assess the competence of candidates for board certification, lead author Dr. Lincoln Berland from the University of Alabama at Birmingham told AuntMinnie.com.

"Under the American Board of Medical Specialties [ABMS], the ABR has focused on the reliability of its exam -- defining 'reliability' as whether you'll get the same answer if you administer the test twice," he said. "But this definition doesn't mean that the test is valid -- that is, that it does what it is supposed to do, which is evaluate professional competence."

Valid method?

In 2008, the ABR's certification program underwent changes, including the elimination of the oral part of the exam for diagnostic radiology, changes in the timing of the certifying exam, and the redesign of the test for standardization through psychometric testing -- a framework that has been used extensively to assess public school education, according to Berland and colleagues.

Dr. Lincoln Berland from the University of Alabama at Birmingham.

These changes sparked debate, including questions about the effectiveness of the ABR's testing and the validity of the psychometric method, they wrote. Because psychometric testing is a mechanistic attempt to measure intellectual achievement, its essential flaw is that it cannot take into account a range of learning styles. In radiology in particular, there has been little research on the effectiveness of ABR testing, either before or after the elimination of the oral component of the certification examination, Berland said.

"For this study, we analyzed testing research outside of the medical field and found that many educators think standardized, multiple-choice testing is at least ineffective and at most harmful," he said. "Many other educational systems around the world use this type of testing either sparingly or not at all. If the point is to evaluate a person's skill in what they actually do, this technique just doesn't work. It's like giving a Tony Award for the play with the performers who are best at memorizing lines, rather than those who actually bring the play to life."

Performance-based evaluation of skills is much more effective, particularly in radiology, Berland said.

"As a profession, we've become enamored with statistics and metrics -- and these do have their place," he said. "But this emphasis has been combined with a tendency to discount observation as an evaluation technique, particularly for skills such as the ability to interpret abnormalities on an image in context, to collaborate with colleagues to refine their conclusions, to effectively report the results of exams, to communicate well with patients and referring physicians, and to develop a capacity for leadership."

These "noninterpretive" skills are taught and assessed during residency -- and residency programs are responsible for certifying that trainees are eligible to take the ABR exam -- but they are not formally tested, Berland and colleagues noted. The ABR and the Accreditation Council for Graduate Medical Education (ACGME) "artificially separate responsibility for assuring competence, with the ABR surrendering assessment of most key skills to residency training," they wrote.

Mired in old thinking?

The ABR and the American Board of Medical Specialties proclaim that they have created the "exam of the future," Berland and colleagues wrote. But, in fact, the two organizations are stuck in outdated testing concepts.

"The ABMS and ABR have been mired in testing concepts based on decades-old thinking that do not apply to modern radiology and medicine," they wrote.

The ABR's stated mission is to serve "patients, the public, and the medical profession by certifying that its diplomates have acquired, demonstrated, and maintained a requisite standard of knowledge, skill, understanding, and performance," according to the authors. However, evidence that its current certification exam serves this mission is weak.

What's needed instead is "authentic" testing -- that is, testing that evaluates what a radiologist actually does, Berland said.

" 'Authentic' testing incorporates human observation, including collaborative and interpersonal skills and the ability to make good use of electronic information, whereas multiple-choice tests lead to 'teaching to the test,' " he said.

Radiologists can be great teachers, but they're not necessarily experts in modern teaching models, Berland's group noted. That's why it's important to get education researchers and even patient advocacy groups involved in redesigning how radiologists' competence is evaluated.

"We need to get a wide range of educational experts involved in this, as well as laypeople," Berland said. "The whole concept of how doctors should be evaluated needs to be reassessed from the ground up."
