UF researchers developing novel technology to use AI in the ICU
News  |  Friday, November 12, 2021, 11:59 p.m.
PHOTO: Azra Bihorac, M.D. (right), the senior associate dean for research affairs and a professor of medicine, surgery and anesthesiology in the University of Florida College of Medicine, is co-leading research teams with Parisa Rashidi, Ph.D., an associate professor in the J. Crayton Pruitt Family Department of Biomedical Engineering. Their goal is to develop and implement artificial intelligence for the intensive care unit. One of those projects involves ambient AI-assisted technology to prevent ICU delirium.

By Doug Bennett

University of Florida


Gainesville, Florida - In the intensive care unit of the future, a patient’s pain level will be captured through visual cues such as body movement and facial expressions.

Sensors will record head and limb movements, posture and mobility. Other monitors will detect light and noise and adjust them to optimal levels. Computer algorithms will analyze the torrents of data flowing from the patient and their room, giving physicians the ability to make timelier, more precise treatment decisions.

To make that a reality, a group of University of Florida Health researchers is developing an intelligent ICU, an autonomous and highly detailed patient-monitoring system driven by artificial intelligence.

The premise is as simple as the solutions are complex: Nurses and physicians come and go from an ICU room, noting the patient’s condition and physiological signs. But the machinery in an intelligent ICU never rests, harvesting data from cameras, wearable sensors, light sensors, noise meters and other equipment.

Capturing, analyzing and acting on that information should improve patient care and facilitate physicians’ decision-making. Ideally, researchers want the AI-driven system to be powerful and accurate enough to predict whether an ICU patient might improve or decline. Algorithms that sift through massive amounts of patient data may one day give physicians a new, crucial tool: Time to make decisions before a patient’s condition changes.

For now, there is a lot of information in an ICU room that is not being used simply because the technology doesn’t yet exist to harvest and analyze that data. A pervasive-sensing, AI-based system could change that.

“These models will allow us to take into account all of the information collected in the ICU. It should be able to provide a precise, accurate and continuous indication of the patient’s status — even when a nurse or doctor isn’t in the room,” said Parisa Rashidi, Ph.D., an associate professor in the J. Crayton Pruitt Family Department of Biomedical Engineering. 

Rashidi is collaborating with Azra Bihorac, M.D., the senior associate dean for research affairs and a professor of medicine, surgery and anesthesiology in the UF College of Medicine. Earlier this year, Bihorac and Rashidi founded the UF Intelligent Critical Care Center, or IC3. Among its goals, the center aims to expand research into AI-driven diagnosis and clinical decision-making and improve patient care with new technology. Training the next generation of cross-disciplinary AI researchers who are focused on engineering and medicine is also a priority.

Through IC3, Bihorac and Rashidi are aiming to push UF to statewide and national prominence in ambient, immersive and AI research that ultimately benefits critically ill patients.

“The intensive care unit is particularly fertile ground for artificial intelligence,” Bihorac said. “And how we see the hospital is going to change.”

For Bihorac, Rashidi and their collaborators, that currently involves studies to recognize visual cues of pain and agitation and to aid physicians’ decision-making by using AI to assess a patient’s condition. Yet another project is using AI to monitor patients’ mobility and room environment as a way to ward off delirium, a brain dysfunction syndrome that is common in the ICU.

For the Intelligent ICU pilot project, the team created machine-learning models that pick up the smallest of indicators: head and limb movements, visual cues that indicate pain, posture and patient mobility. Without pervasive AI, those key patient indicators might be missed by busy nurses and physicians.
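
The article does not publish the team’s code, but the flavor of such a model can be sketched in a few lines of Python. The fragment below is purely illustrative and is not UF’s implementation: it assumes a separate pose-estimation step has already converted camera frames into joint keypoints, and the function names, window size and motion threshold are invented for the example.

import numpy as np

def movement_intensity(keypoints: np.ndarray) -> np.ndarray:
    # keypoints: (frames, joints, 2) pixel coordinates from an assumed
    # pose-estimation model; returns the mean per-frame joint displacement.
    deltas = np.linalg.norm(np.diff(keypoints, axis=0), axis=2)
    return deltas.mean(axis=1)

def flag_sustained_motion(intensity: np.ndarray, threshold: float = 5.0, window: int = 30) -> bool:
    # Flag when average motion stays above an arbitrary threshold for a full
    # window of frames, a crude stand-in for an agitation detector.
    if intensity.size < window:
        return False
    rolling = np.convolve(intensity, np.ones(window) / window, mode="valid")
    return bool((rolling > threshold).any())

# Toy data: 200 calm frames followed by 100 frames of large random motion.
rng = np.random.default_rng(0)
calm = rng.normal(0, 1, size=(200, 17, 2)).cumsum(axis=0)
agitated = calm[-1] + rng.normal(0, 8, size=(100, 17, 2)).cumsum(axis=0)
print(flag_sustained_motion(movement_intensity(np.concatenate([calm, agitated]))))

A real system would replace the toy threshold with a trained model and fuse the camera stream with wearable-sensor data, but the core step, turning raw frames into a continuous, quantitative record of patient movement, is the same.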

Once fully developed and tested, Bihorac said the technology in an intelligent ICU room should provide powerful tools to help physicians make more informed, timelier decisions.

“There can’t be a human caregiver in every patient’s room all the time. For most people, this will be like having the eyes of a health care provider on you all the time,” Bihorac said.

Another project, I2CU, is the first attempt to precisely predict patients’ medical trajectory and use autonomous visual assessment in the ICU. It is also the first time an artificial intelligence platform is being tested in real time in a hospital setting, according to the researchers. The work is funded by a $2.4 million National Institutes of Health grant.

To address ICU delirium, the research team is combining pervasive sensing and machine learning techniques to develop adaptive interventions that sense and adjust the patient’s environment. This work is funded by ADAPT, a $2.9 million NIH grant.

ICU delirium, an intense and sudden confusion that can include delusion and paranoia, can affect anywhere from one-third to 80% of ICU patients. To combat that, researchers are using AI techniques to look at everything in an ICU environment that can potentially disrupt patients: light levels, noise and perhaps even odor. Preliminary data have shown that the noise level in an ICU can be three times greater than what is considered ideal, Bihorac said. A typical ICU room also might be too noisy, not dark enough at night or too dim during the day.

Using an array of sensors and AI algorithms, the researchers are working on ways to monitor patients’ mobility and circadian rhythm, the body’s 24-hour internal clock. Reducing nightly disruptions and optimizing light and sound levels are a path to potentially preventing ICU delirium.
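
On the environmental side, the logic is simpler to illustrate. The sketch below is again hypothetical, not the ADAPT project’s code: the day/night cutoffs are assumptions loosely informed by published hospital-noise guidance, and the class and function names are invented for the example.

from dataclasses import dataclass

@dataclass
class RoomReading:
    hour: int        # 0-23, local time
    noise_db: float  # A-weighted sound level in the room
    light_lux: float # ambient light level

def is_disruptive(r: RoomReading) -> bool:
    # Night should be quiet and dark; daytime rooms should be reasonably bright.
    # The cutoffs (35 dB, 50 lux, 200 lux) are illustrative assumptions only.
    night = r.hour >= 22 or r.hour < 6
    if night:
        return r.noise_db > 35 or r.light_lux > 50
    return r.light_lux < 200

readings = [
    RoomReading(hour=2, noise_db=58, light_lux=120),   # loud, bright night
    RoomReading(hour=14, noise_db=45, light_lux=80),   # dim afternoon
    RoomReading(hour=10, noise_db=50, light_lux=400),  # bright morning
]
print([is_disruptive(r) for r in readings])  # [True, True, False]

In the system the article describes, readings like these would come from continuous light and noise sensors, and the response would be adaptive, adjusting the room itself rather than simply logging a flag.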

“The idea is to get ahead of delirium by predicting its trajectory before it gets really bad,” Rashidi said.

Results of the Intelligent ICU pilot study showed an AI-augmented patient-monitoring system is feasible. It captured a host of patient characteristics and ICU conditions, including visual cues related to pain and agitation, limb movement analysis, sound and light levels as well as room visits. The study was the first of its kind to use AI to continuously assess critically ill patients and their environment using pervasive sensing, the researchers said.

Deploying the system in an actual ICU is not without challenges, the researchers noted. Optimal placement of the pervasive sensing system in a small ICU room is one issue. Wearable sensors may be difficult to place on patients already attached to medical equipment. Still, the pilot study was conceptual proof that such a system can work.

The I2CU and delirium studies, launched earlier this year, run through 2026. Testing in a clinical setting could begin in about four years, Rashidi said.

Developing algorithms that can reliably interpret expressions in a variety of patients presents significant challenges, according to Bihorac. Identifying pain and agitation through an AI system requires a significant amount of data. Any AI-augmented ICU monitoring system also has to account for a variety of situations, including patients of different races and genders as well as those who may be wearing face masks.

“Nothing can replace face-to-face contact and human decision-making,” Bihorac said. “But we’re very excited about how technology can be used to help critically ill patients.”
 