Scientists develop artificial intelligence system to detect cardiac arrest during sleep
A new skill for smart speakers such as Google Home and Amazon Alexa, or for smartphones, lets the device detect the gasping sound of agonal breathing and call for help
Washington: Scientists have developed a new artificial intelligence (AI) system that can monitor people for cardiac arrest while they are asleep, without touching them.
People experiencing cardiac arrest will suddenly become unresponsive and either stop breathing or gasp for air, a sign known as agonal breathing, said researchers at the University of Washington (UW) in the US.
A new skill for a smart speaker, such as Google Home or Amazon Alexa, or for a smartphone lets the device detect the gasping sound of agonal breathing and call for help. Immediate cardiopulmonary resuscitation (CPR) can double or triple someone’s chance of survival, but that requires a bystander to be present.
CPR is an emergency procedure that combines chest compressions, often with artificial ventilation, in an effort to manually preserve intact brain function.
Recent research suggests that one of the most common locations for an out-of-hospital cardiac arrest is a patient’s bedroom, where no one is likely to be around or awake to respond and provide care. The new proof-of-concept tool, developed using real instances of agonal breathing captured from 911 calls, detected agonal breathing events 97 per cent of the time from up to six metres away.

“A lot of people have smart speakers in their homes, and these devices have amazing capabilities that we can take advantage of,” said Shyam Gollakota, an associate professor at UW.

“We envision a contactless system that works by continuously and passively monitoring the bedroom for an agonal breathing event, and alerts anyone nearby to come provide CPR. And then if there’s no response, the device can automatically call 911,” said Gollakota.
Agonal breathing is present for about 50 per cent of people who experience cardiac arrests, and patients who take agonal breaths often have a better chance of surviving. “This kind of breathing happens when a patient experiences really low oxygen levels,” said Jacob Sunshine, an assistant professor at the UW School of Medicine. “It’s sort of a guttural gasping noise, and its uniqueness makes it a good audio biomarker to use to identify if someone is experiencing a cardiac arrest,” Gollakota said.
The researchers gathered sounds of agonal breathing from real 911 calls to Seattle's Emergency Medical Services.
Since cardiac arrest patients are often unconscious, bystanders recorded the agonal breathing sounds by putting their phones up to the patient's mouth so that the dispatcher could determine whether the patient needed immediate CPR.
The researchers collected 162 calls between 2009 and 2017 and extracted 2.5 seconds of audio at the start of each agonal breath to come up with a total of 236 clips.
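For readers who want a concrete picture of that preprocessing step, the sketch below slices fixed 2.5-second windows out of longer call recordings. It assumes the onset time of each agonal breath has already been marked by hand; the file name and annotation format are purely illustrative, not taken from the researchers' pipeline.

```python
# Illustrative only: cut 2.5-second clips from a longer recording, given
# hand-annotated onset times for each agonal breath. File names and the
# annotation format are hypothetical.
import soundfile as sf  # reads WAV files into NumPy arrays

CLIP_SECONDS = 2.5

def extract_clips(wav_path, onset_times_s):
    """Return a list of 2.5-second clips starting at each annotated onset."""
    audio, sample_rate = sf.read(wav_path)
    clip_len = int(CLIP_SECONDS * sample_rate)
    clips = []
    for onset in onset_times_s:
        start = int(onset * sample_rate)
        clip = audio[start:start + clip_len]
        if len(clip) == clip_len:   # drop clips that would run past the end
            clips.append(clip)
    return clips, sample_rate

# e.g. clips, sr = extract_clips("call_0001.wav", onset_times_s=[12.4, 31.0])
```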
They captured the recordings on different smart devices and used various machine learning techniques to boost the dataset to 7,316 positive clips.
“We played these examples at different distances to simulate what it would sound like if the patient was at different places in the bedroom,” said Justin Chan, a doctoral student at UW.
“We also added different interfering sounds such as sounds of cats and dogs, cars honking, air conditioning, things that you might normally hear in a home,” said Chan.

For the negative dataset, the team used 83 hours of audio data collected during sleep studies, yielding 7,305 sound samples.
These clips contained typical sounds that people make in their sleep, such as snoring or obstructive sleep apnea.
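That kind of augmentation can be approximated in software by attenuating each clean clip to mimic a more distant source and mixing in household noise at varying levels. The researchers actually replayed the clips through speakers and re-recorded them on real devices, so the sketch below is only a simplified stand-in for that process.

```python
# Simplified stand-in for the team's augmentation: they replayed clips through
# speakers at different distances and re-recorded them on smart devices; this
# sketch just scales and mixes audio arrays in software.
import numpy as np

def augment(clip, noise, distance_factor=1.0, snr_db=10.0):
    """Attenuate a clip to mimic distance and mix in interfering noise."""
    attenuated = clip / max(distance_factor, 1.0)
    # Fit the noise (barking, honking, air conditioning, ...) to the clip
    # length, then scale it to the requested signal-to-noise ratio.
    noise = np.resize(noise, attenuated.shape)
    clip_power = np.mean(attenuated ** 2) + 1e-12
    noise_power = np.mean(noise ** 2) + 1e-12
    target_noise_power = clip_power / (10 ** (snr_db / 10))
    return attenuated + noise * np.sqrt(target_noise_power / noise_power)

# Varying distance_factor, the noise source and snr_db turns each positive
# clip into many training examples.
```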
From these datasets, the team used machine learning to create a tool that could detect agonal breathing 97 per cent of the time when the smart device was placed up to six metres away from a speaker generating the sounds.
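The article does not say which model the team ultimately used, so the following is just one plausible recipe for such a detector: summarise each 2.5-second clip with standard audio features and fit an off-the-shelf classifier on the positive and negative examples. The feature choice and classifier here are assumptions for illustration.

```python
# One plausible recipe (not the team's confirmed method): MFCC audio features
# plus an off-the-shelf support vector machine trained on the positive
# (agonal breathing) and negative (ordinary sleep sounds) clips.
import numpy as np
import librosa                # audio feature extraction
from sklearn.svm import SVC

def features(clip, sample_rate):
    """Summarise a clip as its mean MFCC vector."""
    mfcc = librosa.feature.mfcc(y=clip, sr=sample_rate, n_mfcc=20)
    return mfcc.mean(axis=1)

def train_detector(positive_clips, negative_clips, sample_rate):
    X = np.array([features(c, sample_rate)
                  for c in positive_clips + negative_clips])
    y = np.array([1] * len(positive_clips) + [0] * len(negative_clips))
    return SVC(probability=True).fit(X, y)

# model.predict_proba(features(window, sr).reshape(1, -1))[0, 1] then gives
# the probability that a new 2.5-second window contains agonal breathing.
```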
Next, the team tested the algorithm to make sure that it wouldn't accidentally classify a different type of breathing, like snoring, as agonal breathing.
“We don't want to alert either emergency services or loved ones unnecessarily, so it's important that we reduce our false positive rate,” Chan said.
For the sleep lab data, the algorithm incorrectly categorized a breathing sound as agonal breathing 0.14 per cent of the time.
The false positive rate was about 0.22 per cent for separate audio clips, in which volunteers had recorded themselves while sleeping in their own homes.
However, when the team had the tool classify something as agonal breathing only when it detected two distinct events at least 10 seconds apart, the false positive rate fell to 0 per cent for both tests.
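That two-event rule is simple enough to sketch directly. The 10-second spacing comes from the article; the class and method names below are hypothetical.

```python
# Sketch of the two-event rule described above: only raise an alarm once two
# separate agonal-breathing detections occur at least 10 seconds apart.
# Names are illustrative, not taken from the researchers' code.
ALARM_SPACING_SECONDS = 10.0

class AgonalBreathingAlarm:
    def __init__(self):
        self.first_detection_time = None

    def on_detection(self, timestamp_s):
        """Call for each positively classified window; returns True to alarm."""
        if self.first_detection_time is None:
            self.first_detection_time = timestamp_s
            return False
        # A second detection at least 10 seconds after the first confirms it.
        return timestamp_s - self.first_detection_time >= ALARM_SPACING_SECONDS
```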
The team envisions that this algorithm could function as an app, or as a skill for Alexa, running passively on a smart speaker or smartphone while people sleep.
“This could run locally on the processors contained in the Alexa. It's running in real time, so you don't need to store anything or send anything to the cloud,” Gollakota added.
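In outline, such a contactless monitor would loop over short microphone windows and classify each one entirely on the device, keeping nothing and sending nothing to the cloud. The sketch below shows the shape of that loop; classify() and raise_alarm() are placeholders for the trained detector and the alerting logic, and the two-event filter from the earlier sketch is reused.

```python
# Shape of a purely local, real-time monitoring loop: capture short windows
# from the microphone, classify each one on-device, and never store audio or
# send it to the cloud. classify() and raise_alarm() are placeholders.
import time
import sounddevice as sd   # cross-platform microphone capture

SAMPLE_RATE = 16000
WINDOW_SECONDS = 2.5

def monitor(classify, raise_alarm, alarm_logic):
    while True:
        # Blocking capture of one 2.5-second mono window.
        window = sd.rec(int(WINDOW_SECONDS * SAMPLE_RATE),
                        samplerate=SAMPLE_RATE, channels=1)
        sd.wait()
        if classify(window):                       # flagged as agonal breathing?
            if alarm_logic.on_detection(time.time()):
                raise_alarm()                      # alert bystanders or call 911
```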