In a retrospective study involving seven radiologists, a multi-institutional team led by Dr. Alyssa Watanabe of the University of Southern California (USC) found that AI-based computer-aided detection (CAD) software yielded an average 27% increase in the breast cancer detection rate and a 7.2% improvement in accuracy. Both results were statistically significant. What's more, the software on its own was more accurate than the radiologists participating in the study.
"Even further improvement in [radiologist] cancer detection rate is possible with development of greater trust and continued refinements in the software," Watanabe said. She is also chief medical officer at AI software developer CureMetrix, which developed the cmAssist AI-based CAD software used in the study.
Many cancers are not detected mammographically, but up to 50% of these can be seen retrospectively on prior mammograms, Watanabe said. As a result, the researchers sought to determine whether an AI-based CAD algorithm could improve the sensitivity of radiologists in interpreting screening mammograms.
They performed a retrospective study to evaluate the AI CAD software using a dataset enriched with false-negative mammograms that had been interpreted with the assistance of traditional mammography CAD software (ImageChecker v. 10.0, Hologic). Of the 122 patients included in the study, 90 had retrospective findings on prior mammograms that were deemed actionable by two radiologists not participating in the study. To make the test set even more challenging, the researchers included the earliest prior actionable mammogram per patient, Watanabe said.
Three mammography fellowship-trained radiologists -- including one who practices exclusively in academic mammography -- participated in the reader panel, along with four general radiologists. Of the four general radiologists, two had fewer than three years in practice, and one was older than 65 years. All panelists were certified under the Mammography Quality Standards Act (MQSA).
The cmAssist software used in the study returns a mark for lesions along with a "neuScore," which ranges from 0 to 100 and indicates the software's suspicion level for malignancy. Only lesions with high neuScores are marked. After accessing the AI CAD results, the radiologists increased their cancer detection rate by a range of 6% to 64%.
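The threshold-based marking described above can be illustrated with a minimal sketch. This is an assumption-laden toy, not CureMetrix's actual implementation: the threshold value, the data structures, and the function name are all hypothetical; the only detail taken from the article is that scores run from 0 to 100 and only high-scoring lesions are marked.

```python
# Hypothetical sketch of score-threshold lesion marking. The cutoff of 75
# and the dict layout are illustrative assumptions, not cmAssist internals.

def mark_lesions(lesions, threshold=75):
    """Return only lesions whose suspicion score meets the threshold.

    Each lesion carries a 'score' in [0, 100]; higher means more
    suspicious for malignancy. Mirrors the idea that only lesions
    with high scores receive a mark.
    """
    return [lesion for lesion in lesions if lesion["score"] >= threshold]

candidates = [
    {"id": "A", "score": 92},
    {"id": "B", "score": 40},
    {"id": "C", "score": 81},
]
marked = mark_lesions(candidates)
# Only lesions A and C clear the assumed threshold of 75.
```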
**Effect of AI-based CAD software on true-positive recall rate**

| | Cancer detection rate before AI CAD | Cancer detection rate after AI CAD | Average percentage increase in cancer detection rate |
|---|---|---|---|
| Average for 7 readers | 51% | 62% | 27% |
Benefiting all readers
The researchers also found that AI CAD raised the sensitivity of two less-experienced radiologists from 42% and 54%, respectively, to 68%. That level of performance nearly reached the 75% cancer detection rate achieved by the top reader in the study -- a fellowship-trained breast imager. In addition, the senior radiologist, who had the lowest cancer detection rate in the study, benefited the most from AI CAD. His cancer detection rate increased by 64%, climbing from 25% to 41%, according to the researchers.
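The 64% figure above is a relative increase over the reader's baseline rate, not a percentage-point change. A quick check of the arithmetic, using only the values reported in the article:

```python
# Relative increase as a percentage: standard formula, applied to the
# detection rates reported in the article (25% before AI CAD, 41% after).

def relative_increase(before, after):
    """Relative increase of `after` over `before`, as a percentage."""
    return (after - before) / before * 100

# (0.41 - 0.25) / 0.25 = 0.64, i.e. a 64% relative gain.
print(round(relative_increase(0.25, 0.41)))  # 64
```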
Overall, the use of AI CAD increased the false-positive recall rate by less than 1%. In other results, the seven readers in the study had an average area under the curve (AUC) of 0.76 before AI CAD and 0.82 afterward; the 7.2% improvement in accuracy was statistically significant (p = 0.03). On its own, the AI CAD software had an overall AUC of 0.875, exceeding that of the reader panel.
"What this means is that the cancer detection rate could actually have been higher if the readers had followed the markings more consistently," Watanabe said.
Even the top reader ignored 25% of the flagged cancer lesions -- indicating the potential for even more reader benefit from AI CAD, according to the researchers.
Watanabe also noted that the software's AUC has improved further since the abstract was submitted, rising to 0.90.
"It's because [of] AI self-learning and other work done by data scientists that the software will continue to improve," she said.