Researchers found that artificial intelligence can detect and diagnose colorectal cancer from tissue scans as well as or better than pathologists, according to a new study in the journal Nature Communications, as reported in a news release from Tulane University.
The study, which was conducted by researchers from Tulane, Central South University in China, the University of Oklahoma Health Sciences Center, Temple University, and Florida State University, was designed to test whether AI could be a tool to help pathologists keep pace with the rising demand for their services.
Pathologists evaluate and label thousands of histopathology images on a regular basis to tell whether someone has cancer. But their average workload has increased significantly, and fatigue can sometimes lead to unintended misdiagnoses, according to Tulane University.
“Even though a lot of their work is repetitive, most pathologists are extremely busy because there’s a huge demand for what they do, but there’s a global shortage of qualified pathologists, especially in many developing countries,” said Hong-Wen Deng, PhD, Professor and Director of the Tulane Center of Biomedical Informatics and Genomics at Tulane University School of Medicine.
To conduct the study, Deng and his team collected over 13,000 images of colorectal cancer from 8,803 subjects and 13 independent cancer centers in China, Germany and the United States. Using the images, which were randomly selected by technicians, they built a machine-assisted pathological-recognition program that allows a computer to recognize images showing colorectal cancer, one of the most common causes of cancer-related deaths in Europe and America.
“The challenges of this study stemmed from complex large image sizes, complex shapes, textures, and histological changes in nuclear staining,” Deng said. “But ultimately the study revealed that when we used AI to diagnose colorectal cancer, the performance is shown comparable to and even better in many cases than real pathologists.”
The area under the receiver operating characteristic (ROC) curve, or AUC, was the performance metric Deng and his team used to evaluate the study. After comparing the computer’s results with the work of highly experienced pathologists who interpreted the data manually, the study found that the average pathologist scored 0.969 for accurately identifying colorectal cancer. The machine-assisted AI program averaged 0.98, a comparable and in many cases more accurate result.
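To illustrate what an AUC score like 0.969 or 0.98 means, here is a minimal pure-Python sketch (not the study's actual code, and the data below are hypothetical): AUC equals the probability that a randomly chosen positive case receives a higher model score than a randomly chosen negative case, so a score near 1.0 indicates near-perfect ranking of cancerous over non-cancerous images.

```python
def auc(y_true, y_score):
    """Compute AUC by comparing every positive/negative score pair.

    A pair counts as a 'win' when the positive case scores higher,
    and as half a win on a tie.
    """
    pos = [s for y, s in zip(y_true, y_score) if y == 1]
    neg = [s for y, s in zip(y_true, y_score) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical labels (1 = cancer present) and model probabilities
y_true = [0, 0, 1, 1, 0, 1, 1, 0]
y_score = [0.10, 0.35, 0.80, 0.92, 0.20, 0.65, 0.88, 0.70]

print(auc(y_true, y_score))  # 0.9375
```

In practice a library routine such as scikit-learn's `roc_auc_score` would be used, but the pairwise-comparison definition above is equivalent for this purpose.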