A self-adaptive eXtended reality environment for training and therapeutic purposes.
Background
Immersive eXtended reality (XR) technologies will drive some of the coming decade's most innovative developments in healthcare. They open up new avenues for clinical and paraclinical training and offer interactive virtual patient settings. However, to keep pace with the trend towards individualised healthcare (e.g., personalised medicine and digital therapeutics), XR environments have to become smart systems tailored to the individual needs of their users and patients. Such Adaptive XR tools rely on user feedback obtained through measurements of cognitively and physiologically relevant digital biomarkers, providing healthcare professionals with a quantitative basis for customising XR training and therapies.
Project Content
EyeQTrack intends to lay the foundation for Adaptive XR technologies that enable individualised training and therapies in healthcare. We extract information on human cognitive and emotional responses from the eye-tracking data streams provided by immersive XR devices and from other physiological measurements, and process these data using advanced image processing, quantitative analysis, and machine learning/artificial intelligence (ML/AI). In other words, we convert ocular data and other measures into information about the stress level, cognitive load, degree of fatigue, and attentional state of people undergoing treatment. This broadens the range of possible therapeutic interventions and allows them to be tailored to individual requirements.
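As an illustration of how ocular data can be turned into such information, the sketch below computes a simple baseline-relative pupil dilation index, a feature commonly associated with cognitive load and arousal. The function name, thresholds, and toy values are our own assumptions, not the project's actual pipeline.

```python
import statistics

def pupil_dilation_index(samples_mm, baseline_mm):
    """Relative change of mean pupil diameter vs. a resting baseline.

    Positive values indicate dilation relative to rest, which is often
    used as a rough proxy for increased cognitive load or arousal.
    """
    mean_now = statistics.fmean(samples_mm)
    return (mean_now - baseline_mm) / baseline_mm

# Toy data: pupil diameters (mm) sampled over a short task window.
baseline = 3.2  # resting-state mean diameter, measured beforehand
window = [3.5, 3.6, 3.4, 3.7, 3.5]
print(round(pupil_dilation_index(window, baseline), 3))  # → 0.106
```

In a real system this feature would be computed over sliding windows, corrected for luminance changes in the virtual scene, and combined with other bio-signals before any inference about stress is made.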
Goals
We pursue several aims in the current project:
- Setting up a real-time, open multi-sensor platform that receives parallel data streams of bio-signals, which are processed and sent as feedback signals to the connected XR system.
- Developing novel algorithms for estimating stress levels from bio-signals. These signals comprise eye-tracking data (e.g., pupil dilation), heart rate and other cardiac parameters, skin conductance and skin temperature, and data provided by three-dimensional human gait analysis (3DGA).
- Developing a framework that combines adaptive XR environments with gaze-guidance technology. This will, for instance, enable fall-prevention training for people with Parkinson's disease.
- Building a use case for adaptive XR training in clinical and para-clinical settings.
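The first aim, a real-time multi-sensor platform, can be pictured as a hub into which parallel bio-signal streams feed and from which a fused feedback signal is sent to the XR system. The sketch below is a minimal, single-machine illustration of that idea using threads and a shared queue; the channel names, normalised values, and averaging rule are illustrative assumptions, not the platform's design.

```python
import queue
import threading

# Each sensor thread pushes (channel, value) samples into a shared
# queue; a fusion step keeps the latest value per channel and
# aggregates them into one feedback signal for the XR system.
def sensor(name, values, out_q):
    for v in values:
        out_q.put((name, v))

def fuse(out_q, n_samples):
    latest = {}
    for _ in range(n_samples):
        name, value = out_q.get()
        latest[name] = value  # keep only the most recent sample per channel
    # Feedback: simple average of the latest value of each channel.
    return sum(latest.values()) / len(latest)

q = queue.Queue()
threads = [
    threading.Thread(target=sensor, args=("pupil", [0.4, 0.5, 0.6], q)),
    threading.Thread(target=sensor, args=("heart_rate", [0.7, 0.8, 0.9], q)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()

feedback = fuse(q, 6)
print(feedback)  # → 0.75 (average of latest pupil 0.6 and heart rate 0.9)
```

A production platform would instead use a streaming middleware with timestamp synchronisation across devices, but the structure (parallel producers, one consumer, a fused feedback value) is the same.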
Methods
The EyeQTrack project is located at the intersection of several established research and teaching areas of St. Pölten UAS, namely XR, ML/AI, motor rehabilitation, and nursing and emergency services training. Interdisciplinarity is an essential prerequisite of the project because we combine information from different modalities: eye-tracking data, skin conductance measurements, cardiac data, and features extracted from human gait patterns, which are fed into machine learning and deep learning procedures for stress-level prediction. In contrast to previous work in the field, we develop methods to predict stress levels not only in healthy subjects but also in patients with movement and other neurodegenerative disorders.
We also create an XR environment that simulates typical situations in which falls occur (e.g., a cluttered living room). Based on the performance of the subjects/patients, the environment automatically adapts the number, size, and placement of virtual obstacles. These automatic changes in training difficulty are performed by machine learning algorithms that process eye-tracking data and physiological measurements (e.g., heart rate). Training sessions are thus dynamic and adaptive, as progression is determined by the learners' responses to the challenges they face.
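The difficulty-adaptation logic described above can be sketched as a simple control rule: raise the challenge when the estimated stress level indicates the learner is under-challenged, and lower it when stress leaves a comfort band. The thresholds, bounds, and function name below are illustrative assumptions; the project's actual adaptation is learned from data rather than hand-set.

```python
def adapt_obstacle_count(current_count, stress_level,
                         low=0.3, high=0.7,
                         min_count=1, max_count=10):
    """Adjust the number of virtual obstacles from a stress estimate in [0, 1].

    Below `low` the learner is assumed under-challenged, so difficulty
    rises; above `high` it drops; inside the band it stays unchanged.
    """
    if stress_level < low:
        return min(current_count + 1, max_count)
    if stress_level > high:
        return max(current_count - 1, min_count)
    return current_count

print(adapt_obstacle_count(4, 0.2))  # under-challenged → 5
print(adapt_obstacle_count(4, 0.9))  # over-stressed   → 3
print(adapt_obstacle_count(4, 0.5))  # comfort band    → 4
```

The same rule could equally drive obstacle size or placement; the essential point is the closed loop from bio-signal-derived stress estimates back into the XR environment.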
Results
EyeQTrack's long-term aims extend beyond the current project period and are directed at growing an ecosystem for adaptive XR in healthcare. To foster innovation, we not only pool the expertise of researchers in different fields but also reach out to stakeholders in healthcare and industry. For this reason, we seek collaboration with companies across the Austrian XR innovation landscape and are setting up an EyeQTrack Hub, a network of interest groups from industry, academia, and clinical practice. On top of that, research results become part of the degree programmes in Digital Healthcare (Master's) and Nursing (Bachelor's) at the Department of Media and Digital Technologies and the Department of Health Sciences, respectively.
Publications
Want to know more? Feel free to ask!
Center for Digital Health and Social Innovation
- Dipl.-Ing. Lucas Schöffer BSc
- Dipl.-Ing. Djordje Slijepčević BSc
- Tarique Siragy BSc MSc PhD
- FH-Prof. Manuel Schwanda BSc MScN
- FH-Prof. Priv.-Doz. Dr. Brian Horsak
- FH-Prof. Mag. Dr. Tassilo Pellegrini
- FH-Prof. Priv.-Doz. Dipl.-Ing. Mag. Dr. Matthias Zeppelzauer
- FH-Prof. Dipl.-Wirt.-Inf. Dr. Torsten Priebe
- FH-Prof. Jakob Doppler MSc
- Adrian Vulpe-Grigorasi BEng MEng
- Kerstin Prock BSc PT, MSc
- Benedikt Gollan (Research Studios Austria)
- Michael Wagner (Medical University of Vienna)
- Yuri Russo (University of Exeter, UK)
- Michael Leitner (University of Salzburg)
- Julia Kern (Soma Reality GmbH)