Math model optimizes CT radiation for liver cancer

Wednesday, December 4 | 10:30 a.m.-10:40 a.m. | SSK18-01 | Room E353C
Researchers from Duke University have developed a mathematical model that can calculate the ideal CT radiation dose for liver cancer detection by weighing the potential harms of CT radiation against the risk of an incorrect diagnosis.

"While many strategies have been developed to ascertain radiation risk, there has been a paucity of studies assessing the clinical risk (i.e., the likelihood of not delivering a proper diagnosis)," presenter Francesco Ria, DMP, told AuntMinnie.com. "This knowledge gap makes impossible to determine the total radiological procedure risk and, thus, to properly optimize the imaging exam."

In light of this need, the group turned to mathematical modeling to develop a more comprehensive method for determining the optimal CT radiation dose. They tested their model on abdominal CT scans from 21 patients, into which virtual lesions simulating liver cancer had been projected.

The math model determined that the highest CT radiation dose (20 mGy) was actually associated with a lower risk of mortality over the next five years than the lower radiation doses (10 mGy and 5 mGy). This trend occurred because the higher radiation dose allowed for more accurate diagnosis and therefore lower clinical risk, and this reduction in clinical risk outweighed the added radiation exposure in determining total patient risk, Ria noted.
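The trade-off can be illustrated with a simple sketch in which total patient risk is the sum of a radiation-risk term that rises with dose and a clinical-risk term that falls as diagnostic accuracy improves. The functional forms and numbers below are illustrative assumptions for demonstration only, not the Duke group's actual model.

```python
# Illustrative sketch only: the dose-risk curves and numbers below are
# assumed for demonstration and are not the Duke group's actual model.

def radiation_risk(dose_mgy):
    """Hypothetical radiation detriment, assumed linear in dose."""
    return 0.0001 * dose_mgy  # assumed 0.01% added risk per mGy

def clinical_risk(dose_mgy):
    """Hypothetical risk from a missed diagnosis; assumed to fall as dose
    (and thus image quality and lesion detectability) rises."""
    detection_probability = min(1.0, 0.5 + 0.025 * dose_mgy)  # assumed detectability curve
    missed_diagnosis_penalty = 0.05                           # assumed mortality impact of a miss
    return (1.0 - detection_probability) * missed_diagnosis_penalty

def total_risk(dose_mgy):
    """Total patient risk = radiation risk + clinical (diagnostic) risk."""
    return radiation_risk(dose_mgy) + clinical_risk(dose_mgy)

for dose in (5, 10, 20):
    print(f"{dose} mGy: total risk = {total_risk(dose):.4f}")
```

Under these assumed numbers, the 20-mGy scan yields the lowest total risk, mirroring the trend the researchers reported: the drop in clinical risk at higher dose more than offsets the added radiation risk.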

"To properly embody the essence of the as low as reasonably achievable (ALARA) principle, we need a methodology to quantitatively assess radiological risk and benefit simultaneously and to minimize the total risk to the individual patient," senior study author Ehsan Samei, PhD, said. "Our method allows us to precisely do so, holistically and objectively, maintaining lowest radiation exposure for an individual patient while simultaneously ensuring the needed clinical quality."
