By Erik L. Ridley, AuntMinnie staff writer

August 9, 2017 -- Computer-aided detection (CAD) software can help monitor multiple sclerosis progression on MRI, bridging the skill gap between neuroradiologists and nonspecialized, untrained imaging readers, according to recent research published online in the Journal of the American College of Radiology.

In a proof-of-concept study, researchers from the University of Melbourne found that three untrained readers -- a radiology resident, a surgical resident, and a medical student -- could detect significantly more new lesions using the institution's internally developed multiple sclerosis lesion detection and monitoring software than could experienced neuroradiologists performing only conventional side-by-side comparison of consecutive MRI studies without CAD.

What's more, both inexperienced readers and subspecialized neuroradiologists tended to detect the same new lesions when using the CAD software.

"This [performance] could, in theory, translate to improved image reporting at the point of care in the outpatient setting," wrote the study authors, led by Dr. Ariel Dahan of the University of Melbourne.

Evaluating MS lesions

MRI findings are the primary biomarkers used to measure disease progression in multiple sclerosis, because most demyelinating events are asymptomatic. Radiologists typically evaluate MS lesions by comparing chronological studies side by side, but the sensitivity of this approach is degraded by multiple human, technical, and technological factors, according to the authors.

To improve on this process, a University of Melbourne team developed VisTarsier (VTS), a semiautomated software platform designed to assist in detection and monitoring of MS lesions. An initial evaluation published in 2015 reported that two fellowship-trained neuroradiologists detected more lesions when using the software.

The group next wanted to test its hypothesis that nonneuroradiologist medical professionals using the university's MS comparison software would show improved lesion detection and reduced interreader variability compared with neuroradiologists performing conventional side-by-side comparisons of volumetric fluid-attenuated inversion-recovery (FLAIR) MRI studies (JACR, July 29, 2017).

Three untrained readers -- a final-year medical student, a surgical resident, and a fourth-year radiology trainee -- used the VTS software to review the same image pairs read by the two subspecialty neuroradiologists in the previous study. A total of 76 comparative study pairs of consecutive MRI exams were again included in the research; 38 of these were considered by the VTS software to be stable (no disease progression), and 38 were deemed to be unstable (disease progression). The study included patients with a confirmed diagnosis of MS, availability of a diagnostic-quality MRI volumetric FLAIR sequence, and a clinical report completed by a fellowship-trained neuroradiologist with at least three years of experience, using conventional side-by-side comparison of chronological scans.

Blinded to each other's findings and the existing radiology report, the readers used VTS and viewed all images in the sagittal plane on a standard color screen with 960 x 640-pixel resolution after opening each comparative study pair. For the purposes of the study, new lesions were defined as new foci of increased T2 FLAIR signal in previously normal white matter. The software highlighted these in orange as the reader scrolled through sagittal color change maps; each new lesion was correlated to coregistered and resectioned, but otherwise conventional, source T2 FLAIR images in the sagittal plane of both old and new studies. This step was performed to assess whether the foci were true findings or artifacts, according to the researchers.
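The article does not describe VisTarsier's internal algorithm, but the workflow above -- coregister two chronological FLAIR volumes, then highlight voxels whose signal has increased -- can be illustrated with a simplified sketch. The function name, normalization, and threshold below are illustrative assumptions, not the software's actual method:

```python
# Simplified sketch of a FLAIR "change map", assuming the two volumes are
# already coregistered and of identical shape. VisTarsier's real pipeline is
# not described in the article; names and threshold are assumptions.
import numpy as np

def change_map(prior: np.ndarray, current: np.ndarray,
               threshold: float = 0.2) -> np.ndarray:
    """Return a boolean mask of voxels whose signal increased in `current`."""
    # Normalize each volume to [0, 1] so the studies are comparable despite
    # scanner-dependent intensity scales (guard against flat volumes).
    p = (prior - prior.min()) / (np.ptp(prior) or 1.0)
    c = (current - current.min()) / (np.ptp(current) or 1.0)
    # Voxels whose normalized signal rose by more than the threshold are
    # flagged as candidate new lesions (the orange overlay in the article).
    return (c - p) > threshold

# Toy example: a 4 x 4 x 4 "prior" volume and a "current" volume in which
# exactly one voxel has brightened.
prior = np.zeros((4, 4, 4))
current = prior.copy()
current[1, 2, 3] = 1.0
mask = change_map(prior, current)
```

In practice the flagged voxels would still be reviewed against the source images, as the study's readers did, to separate true lesions from artifacts.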

Better performance

After comparing the reader results with the original radiology reports issued for each case, the researchers found that the software improved lesion detection for all readers. In fact, the untrained readers were able to identify significantly more lesions than were identified on the original report produced by an experienced neuroradiologist.

Impact of VTS software on detection of new lesions

Interpretation type                                     Study pairs with new lesions
Originally issued radiology report (without software)   10
Radiology resident (with software)                      41
Medical student (with software)                         37
Neuroradiology fellow (with software)                   37
Neuroradiologist (with software)                        35
Radiology trainee (with software)                       34

The difference was statistically significant for all readers (p ≤ 0.002).
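The article reports p ≤ 0.002 without naming the statistical test. For paired readings of the same 76 study pairs with and without software, an exact McNemar test on the discordant pairs is a common choice; the sketch below uses that test, and the discordant counts in the usage line are illustrative assumptions, since the article does not report them:

```python
# Hedged sketch: the article does not name the test or report discordant-pair
# counts. An exact McNemar test (binomial on discordant pairs under p = 0.5)
# is a standard option for paired reader comparisons like this one.
from math import comb

def mcnemar_exact(b: int, c: int) -> float:
    """Two-sided exact McNemar p-value for discordant counts b and c."""
    n = b + c
    k = min(b, c)
    # Double the lower binomial tail P(X <= k) under p = 0.5, capped at 1.
    tail = sum(comb(n, i) for i in range(k + 1)) / 2**n
    return min(1.0, 2 * tail)

# Illustration only: if the software flagged new lesions in 31 pairs the
# original report called stable, and the reverse never occurred:
p = mcnemar_exact(31, 0)
```

With such one-sided discordance, the resulting p-value is far below the 0.002 threshold the study reports.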

"Even more striking was the significant number of study pairs initially reported as 'stable' or 'no disease progression' by subspecialist neuroradiologists using [conventional side-by-side comparison] when they, in fact, had new white-matter lesions now identified by untrained readers using CAD," the authors wrote. "Importantly, the newly identified lesions had substantial interobserver agreement across both trained and untrained reader groups, implying the new lesions reported tended to be the same ones across all readers using CAD."

In other findings, the mean reporting time per study pair was less than two minutes for each of the three untrained readers.

Within the limitations of the proof-of-concept study, the authors concluded that semiautomated software can, in principle, bridge the gap between untrained readers and subspecialized neuroimaging readers for monitoring multiple sclerosis. This could potentially lead to improved outpatient image reporting at the point of care, they noted.

"Furthermore, the future of radiology may see software similar to the one reported here that would have profound implications on how imaging is used, particularly in the setting of disease monitoring," the authors wrote. "As the need for accurate and rapid radiology reporting increases across all medical and surgical specialties, it is likely that in a number of scenarios (e.g., MS disease progression) nonsubspecialist radiology-trained healthcare providers using CAD may be able to perform as well as subspecialty radiologists."


Copyright © 2017 AuntMinnie.com