Deep-learning tool plumbs x-rays for patient gender, age

Monday, November 26 | 3:50 p.m.-4:00 p.m. | SSE06-06 | Room E353A
In this session, researchers will report early success in developing deep convolutional neural networks (CNNs) that determine patient gender and age from chest x-rays.

The group obtained more than 112,000 frontal chest radiographs from the U.S. National Institutes of Health (NIH) database. The sample included some 48,000 women (44%) and more than 63,000 men (56%) ranging in age from 1 to 95 years. Approximately 5% of the chest x-rays were from pediatric patients.

The researchers then used 70% of the dataset to train the deep CNNs, 10% for validation, and the remaining 20% for testing. During training, each x-ray image was augmented with random rotation, cropping, and flipping. The deep CNNs were also tested on an external dataset of 662 chest radiographs from adults and children in China.
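For readers curious about the mechanics, below is a minimal sketch of what such a split-and-augmentation pipeline could look like, assuming PyTorch/torchvision. The directory path, image size, rotation range, and crop scale are illustrative assumptions; the abstract does not specify the group's actual framework or parameters.

```python
import torch
from torch.utils.data import random_split
from torchvision import datasets, transforms

# Illustrative augmentation pipeline with random rotation, cropping, and
# flipping, as described for the study's training images. The specific
# parameter values here are assumptions, not taken from the abstract.
train_transform = transforms.Compose([
    transforms.Grayscale(num_output_channels=1),
    transforms.RandomRotation(degrees=10),
    transforms.RandomResizedCrop(224, scale=(0.8, 1.0)),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])

# Hypothetical directory layout (e.g., chest_xrays/male, chest_xrays/female);
# the NIH images would need to be arranged this way for this sketch to run.
full_dataset = datasets.ImageFolder("chest_xrays/", transform=train_transform)

# 70% train / 10% validation / 20% test, mirroring the reported split.
# In practice, the validation and test subsets would be given a deterministic
# transform (e.g., resize + center-crop) rather than random augmentation.
n = len(full_dataset)
n_train, n_val = int(0.7 * n), int(0.1 * n)
n_test = n - n_train - n_val
generator = torch.Generator().manual_seed(42)  # reproducible split
train_set, val_set, test_set = random_split(
    full_dataset, [n_train, n_val, n_test], generator=generator
)
```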

Dr. Paul Yi from Johns Hopkins University School of Medicine will explain how the deep CNNs achieved an accuracy of 98% for differentiating between male and female subjects overall, and 83% for discerning gender on pediatric chest x-rays alone. The deep CNNs also reached an accuracy of 98% for differentiating between adult and pediatric patients in both the U.S. and Chinese sample populations.

"The ability to glean demographic information from chest radiographs may aid forensic investigations, as well as help identify novel anatomic landmarks for gender and age," which may be a "useful tool in 'forensic' radiology," the researchers concluded in their abstract.
