AI can spot early signs of Alzheimer’s in speech patterns, study shows

April 25, 2023
O’Donnell Brain Institute researcher says findings may lead to a simple screening test for early detection of cognitive impairment.

New technologies that can capture subtle changes in a patient’s voice may help physicians diagnose cognitive impairment and Alzheimer’s disease before symptoms begin to show, according to a UT Southwestern Medical Center researcher who led a study published in the Alzheimer’s Association publication Diagnosis, Assessment & Disease Monitoring.

Researchers used advanced machine learning and natural language processing (NLP) tools to assess speech patterns in 206 people – 114 who met the criteria for mild cognitive impairment and 92 who were unimpaired. The team then mapped those speech-derived measures to commonly used biomarkers to determine how effectively they detected impairment.
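The study's actual models and feature set are not described here, but the general approach can be illustrated. The sketch below, using hypothetical example transcripts and simple lexical features (type-token ratio, mean word length, filler-word rate), shows in broad strokes how speech-derived measures can be fed to a classifier; it is an assumption-laden illustration, not the researchers' pipeline.

```python
# Illustrative sketch only: derives a few simple lexical features from speech
# transcripts and fits a classifier. Transcripts, labels, and features are
# hypothetical; the study's actual methods and data are not reproduced here.
import numpy as np
from sklearn.linear_model import LogisticRegression

FILLERS = {"um", "uh", "er", "hmm"}

def lexical_features(transcript: str) -> list[float]:
    """Return [type-token ratio, mean word length, filler-word rate]."""
    words = transcript.lower().split()
    if not words:
        return [0.0, 0.0, 0.0]
    type_token_ratio = len(set(words)) / len(words)
    mean_word_len = sum(len(w) for w in words) / len(words)
    filler_rate = sum(w.strip(".,") in FILLERS for w in words) / len(words)
    return [type_token_ratio, mean_word_len, filler_rate]

# Hypothetical picture-description transcripts (0 = unimpaired, 1 = impaired).
transcripts = [
    "The painting shows a quiet harbor with small boats and fishermen at work.",
    "There is um a boat and uh some water and um the the boat is there.",
    "A woman stands by the window watching the storm roll in over the hills.",
    "Um there is a a person and uh hmm something outside I think um yes.",
]
labels = np.array([0, 1, 0, 1])

X = np.array([lexical_features(t) for t in transcripts])
model = LogisticRegression().fit(X, labels)
print(model.predict(X))  # predicted impairment labels for the toy examples
```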

Study participants, who were enrolled in a research program at Emory University in Atlanta, were given several standard cognitive assessments before being asked to record a spontaneous 1- to 2-minute description of artwork.

The research team compared the participants’ speech analytics to their cerebrospinal fluid samples and MRI scans to determine how accurately the digital voice biomarkers detected mild cognitive impairment as well as Alzheimer’s disease status and progression.
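One common way to quantify how well a score derived from speech separates biomarker-positive from biomarker-negative participants is the area under the ROC curve. The sketch below uses made-up status labels and voice-derived risk scores to show that calculation; it does not reflect the study's data or results.

```python
# Illustrative sketch only: scores how well a hypothetical "digital voice
# biomarker" separates biomarker-positive from biomarker-negative participants.
# All numbers are invented for demonstration.
import numpy as np
from sklearn.metrics import roc_auc_score

# 1 = biomarker-positive by CSF/MRI criteria (hypothetical), 0 = negative.
biomarker_status = np.array([1, 0, 1, 1, 0, 0, 1, 0])

# Risk scores derived from the speech analysis (hypothetical values).
voice_biomarker_score = np.array([0.81, 0.22, 0.64, 0.90, 0.35, 0.18, 0.57, 0.41])

# Area under the ROC curve: 1.0 = perfect separation, 0.5 = chance.
auc = roc_auc_score(biomarker_status, voice_biomarker_score)
print(f"ROC AUC: {auc:.2f}")
```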

During the study, researchers spent less than 10 minutes capturing a patient’s voice recording, whereas traditional neuropsychological tests typically take several hours to administer.

UT Southwestern release