A $14 million research project is getting underway in the US to see whether cancer and other diseases can be diagnosed by picking up subtle changes in a person’s voice patterns.
The National Institutes of Health (NIH)-funded project is being run by researchers at the University of South Florida in collaboration with Weill Cornell Medicine in New York City, 10 other institutions in the US and Canada, and with the help of French/US artificial intelligence specialist Owkin.
Called Voice as a Biomarker of Health, the programme is one of several receiving NIH support under the agency’s just-announced Bridge2AI initiative, which has set aside $130 million in funding to accelerate the use of AI technologies in biomedical settings.
It aims to develop an extensive database of voice recordings from both healthy and sick people, which can be used to train AI algorithms to detect changes that could be a sign of cancer, neurological and psychiatric disorders like Alzheimer’s or depression, respiratory illnesses such as pneumonia, and voice/speech disorders, including language delay and autism.
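The project has not published its methods, but a common pattern for this kind of work is to convert each recording into acoustic features and fit a classifier on labelled examples. The Python sketch below illustrates that general idea only; the synthetic audio, MFCC features, and logistic-regression model are stand-ins chosen for the example, not the project's actual pipeline.

```python
# Illustrative sketch only: turn voice clips into acoustic features (MFCCs)
# and fit a simple classifier. Synthetic audio is used so the script runs
# standalone; real work would use curated, labelled patient recordings.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

SR = 16_000          # sample rate in Hz
CLIP_SECONDS = 2     # length of each synthetic "recording"

def featurise(waveform: np.ndarray, sr: int = SR) -> np.ndarray:
    """Summarise a clip as the mean of its MFCC coefficients."""
    mfcc = librosa.feature.mfcc(y=waveform, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)

rng = np.random.default_rng(0)
clips, labels = [], []
for label in (0, 1):                       # toy labels: 0 = "healthy", 1 = "condition"
    for _ in range(40):
        t = np.linspace(0, CLIP_SECONDS, SR * CLIP_SECONDS, endpoint=False)
        pitch = 120 if label == 0 else 180  # crude stand-in for a vocal difference
        wave = np.sin(2 * np.pi * pitch * t) + 0.1 * rng.standard_normal(t.shape)
        clips.append(featurise(wave.astype(np.float32)))
        labels.append(label)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(clips), np.array(labels), test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"toy hold-out accuracy: {clf.score(X_test, y_test):.2f}")
```

In practice the hard part is not the classifier but assembling a large, diverse, well-labelled dataset, which is exactly the gap the NIH project is meant to fill.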
A number of projects have already used patient sounds as a biomarker to detect disease, for example recordings of coughing to screen people for COVID-19 infection.
Vocal patterns are also starting to attract attention as potential diagnostic tools for conditions including post-traumatic stress disorder (PTSD), depression, and stress.
“Although preliminary work with voice data has been promising, limitations to integrating voice as a biomarker in clinical practice have been linked to small datasets, ethical concerns around data ownership and privacy, [and] bias and lack of diversity of the data,” say the partners.
“To solve these, the Voice as a Biomarker of Health project is creating a large, high-quality, multi-institutional and diverse voice database that is linked to identity-protected/unidentifiable biomarkers from other data, such as demographics, medical imaging, and genomics,” they add.
The NIH-backed voice project hopes to develop AI tools that can “empower doctors with a low-cost diagnostic tool to be used alongside other clinical methods.” It will receive $3.8 million in its first year, with the remainder of the $14 million contingent on NIH budget approvals for the following three years.
“Vocal biomarkers are set to play an increasingly important role in healthcare,” commented Thomas Clozel, co-founder and chief executive of Owkin.
“We are excited to be using federated learning, our privacy-preserving AI framework, to connect the medical world together in the pursuit of improving outcomes for patients,” he added.
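Owkin’s framework itself is proprietary, but the general idea behind federated learning is that each participating site trains on its own data locally and shares only model parameters, which a coordinator averages, so raw voice recordings never leave the hospital. The toy, FedAvg-style sketch below (synthetic data, hypothetical three-site set-up) illustrates the mechanics and is not Owkin’s implementation.

```python
# Minimal federated-averaging sketch: each site takes a local gradient step
# on its private data; only the resulting weight vectors are shared and
# averaged by the coordinator. No raw data is exchanged between sites.
import numpy as np

rng = np.random.default_rng(1)
N_SITES, N_FEATURES, ROUNDS, LR = 3, 5, 50, 0.1

# Synthetic local datasets, one per participating hospital.
true_w = rng.normal(size=N_FEATURES)
sites = []
for _ in range(N_SITES):
    X = rng.normal(size=(200, N_FEATURES))
    y = X @ true_w + 0.1 * rng.normal(size=200)
    sites.append((X, y))

global_w = np.zeros(N_FEATURES)
for _ in range(ROUNDS):
    local_weights = []
    for X, y in sites:
        w = global_w.copy()
        # one local gradient step on the site's private data (squared-error loss)
        grad = 2 * X.T @ (X @ w - y) / len(y)
        local_weights.append(w - LR * grad)
    # coordinator averages the weight vectors -- the only thing sent over the wire
    global_w = np.mean(local_weights, axis=0)

print("recovered weights close to truth:", np.allclose(global_w, true_w, atol=0.05))
```

The privacy benefit comes from the fact that only aggregated model updates leave each institution, which is why the approach is attractive for sensitive material such as patient voice recordings.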
Other projects being funded by the Bridge2AI programme are looking at generating standards for biomedical AI, and using machine learning to build a genomic “translator”, based on maps of cell architecture.