Research from the USC Signal Analysis and Interpretation Lab (SAIL), in collaboration with the University of California, Los Angeles (UCLA), suggests that AI can accurately assess people's mental health from their speech.
The researchers analysed voice data from people living with mental illnesses including bipolar disorder, schizophrenia, and major depressive disorder.
Study participants used MyCoachConnect, an interactive voice and mobile tool created by UCLA researchers, to record voice notes about their mental health states.
AI was applied to the recorded data to detect changes in participants' clinical states, and it produced ratings similar to those clinicians would give their patients.
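As a rough illustration of how AI-derived ratings might be compared against clinician ratings, one could measure the correlation between the two sets of scores. The sketch below is purely hypothetical: the severity scores and the 1-7 scale are invented for illustration and are not data or methods from the study.

```python
# Hypothetical sketch: comparing AI-derived severity scores with
# clinician ratings via Pearson correlation. All numbers are invented
# for illustration only.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented example scores on an assumed 1-7 severity scale.
ai_scores        = [2.1, 3.4, 5.0, 4.2, 6.1, 1.8]
clinician_scores = [2.0, 3.0, 5.5, 4.0, 6.0, 2.2]

# A correlation near 1.0 would indicate close agreement.
print(round(pearson(ai_scores, clinician_scores), 3))
```

A high correlation in such a comparison is what "ratings similar to how clinicians would rate their patients" would look like numerically.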
Dr. Shri Narayanan, senior author and director of USC SAIL, said: "Machine learning allowed us to illuminate the various clinically meaningful dimensions of language use and vocal patterns of the patients over time, personalized at each individual level."
The results from the AI could offer insight into possible treatments and setbacks, and different treatment strategies could be evaluated with reference to the AI's results.
Dr. Armen Arevian, director of the Jane and Terry Semel Institute Innovation Lab, said: "Listening to people has always been at the core of psychiatry. Our approach builds on that fundamental technique to hear what people are saying using modern AI. We hope this will help us better understand how our patients are doing and transform mental health care to be more personalized and proactive to what an individual needs."
AI could also give clinicians effective feedback and suggestions, helping them improve the treatment process.