EyeQTrack - Quantitative Eye-Tracking Analytics for Adaptive XR Training & Rehabilitation in Healthcare

A self-adaptive eXtended reality environment for training and therapeutic purposes.

Background

Immersive eXtended reality (XR) technologies will drive some of the coming decade's most innovative developments in healthcare. They open up new avenues for clinical and paraclinical training and offer interactive virtual patient settings. However, to keep pace with the trend towards individualised healthcare (e.g., personalised medicine and digital therapeutics), XR environments have to become smart systems tailored to the individual needs of their users and patients. Such adaptive XR tools rely on user feedback obtained through measurements of cognitively and physiologically relevant digital biomarkers, providing healthcare professionals with a quantitative basis for customising XR training and therapies.

Project Content 

EyeQTrack intends to lay the foundation for adaptive XR technologies that enable individualised training and therapies in healthcare. We extract information on human cognitive and emotional responses from the eye-tracking data streams provided by immersive XR hardware and from other physiological measurements, and we process these data with advanced image processing, quantitative analysis, and machine learning/artificial intelligence (ML/AI). In other words, we convert ocular data and other measures into information about the stress levels, cognitive load, degree of fatigue, and attentional states of the people being treated. This broadens the range of possible therapeutic interventions and allows them to be tailored to individual requirements.
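
To make this concrete, one of the simplest ocular measures of this kind is baseline-corrected pupil dilation. The short Python sketch below computes such an index; the function name, the sample values, and the percent-change formulation are illustrative assumptions rather than the project's actual pipeline:

    import numpy as np

    def pupil_dilation_index(task_mm: np.ndarray, baseline_mm: np.ndarray) -> float:
        """Relative pupil dilation: percent change of the task-phase mean
        over the resting-baseline mean. Larger values are commonly read as
        higher arousal or cognitive load (one feature among several)."""
        baseline = np.nanmean(baseline_mm)   # resting-phase pupil diameter
        task = np.nanmean(task_mm)           # task-phase pupil diameter
        return 100.0 * (task - baseline) / baseline

    # Hypothetical pupil-diameter samples (mm) from an XR headset's eye tracker:
    baseline = np.array([3.1, 3.0, 3.2, 3.1])
    task = np.array([3.6, 3.7, 3.5, 3.8])
    print(f"dilation index: {pupil_dilation_index(task, baseline):+.1f} %")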

Goals

In the current project, we pursue several aims:

  • Setting up an open, real-time multi-sensor platform that receives parallel bio-signal data streams, processes them, and returns feedback signals to the connected XR system (a minimal sketch of this pattern follows the list).
  • Developing novel algorithms for estimating stress levels from bio-signals. These signals comprise eye-tracking data (e.g., pupil dilation), heart rate and other cardiac parameters, skin conductance and skin temperature, and data from three-dimensional human gait analysis (3DGA).
  • Developing a framework that combines adaptive XR environments with gaze-guidance technology. This will, for instance, enable fall-prevention training for people with Parkinson's disease.
  • Building a use case for adaptive XR training in clinical and paraclinical settings.
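
The first of these goals describes, at its core, a producer-consumer architecture: several sensors push samples in parallel, and a fusion step turns them into a feedback signal for the XR system. Below is a minimal sketch of that pattern in Python, assuming simulated sensors, placeholder sampling rates, and a trivial averaging step in place of real signal processing; an actual platform would read from device SDKs or streaming middleware (e.g., Lab Streaming Layer):

    import queue
    import random
    import threading
    import time

    # Hypothetical sensor names and sampling periods (seconds).
    SENSORS = {"eye_tracker": 0.02, "heart_rate": 0.5, "skin_conductance": 0.25}

    def sensor_stream(name, period_s, out, stop):
        # Producer: push (timestamp, sensor, value) samples into a shared queue.
        while not stop.is_set():
            out.put((time.time(), name, random.random()))
            time.sleep(period_s)

    def feedback_loop(out, stop):
        # Consumer: merge the parallel streams and emit a feedback signal that
        # the connected XR system could consume (here it is simply printed).
        latest = {}
        while not stop.is_set():
            try:
                ts, name, value = out.get(timeout=0.1)
            except queue.Empty:
                continue
            latest[name] = value
            if len(latest) == len(SENSORS):               # one sample per sensor
                feedback = sum(latest.values()) / len(latest)  # placeholder fusion
                print(f"{ts:.2f}  feedback -> XR system: {feedback:.2f}")
                latest.clear()                            # wait for a fresh set

    stop = threading.Event()
    samples = queue.Queue()
    threads = [threading.Thread(target=sensor_stream, args=(n, p, samples, stop))
               for n, p in SENSORS.items()]
    threads.append(threading.Thread(target=feedback_loop, args=(samples, stop)))
    for t in threads:
        t.start()
    time.sleep(2)        # run the demo for two seconds
    stop.set()
    for t in threads:
        t.join()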

Methods

The EyeQTrack project is located at the intersection of several established research and teaching areas of St. Pölten UAS, namely XR, ML/AI, motor rehabilitation, and nursing and emergency services training. Interdisciplinarity is an essential prerequisite of the project because we combine information from different modalities: eye-tracking data, skin conductance measures, cardiac data, and features extracted from human gait patterns are fed into machine learning and deep learning models for stress-level prediction. In contrast to previous work in the field, we not only develop methods to predict stress levels in healthy subjects but also include data from patients with movement and other neurodegenerative disorders.

We also create an XR environment that simulates typical situations in which falls occur (e.g., a cluttered living room). Based on the performance of the subjects and patients, the environment automatically makes adaptations regarding the number, size, and placement of virtual obstacles. Such automatic changes in training difficulty are performed by machine learning algorithms that process data from eye tracking and physiological measures (e.g., heart rate). Training sessions are thus dynamic and adaptable, as training progression is determined by the learners' responses to the challenges they face.
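
As an illustration of the stress-prediction step, the following sketch shows the simplest possible fusion scheme: features from the different modalities are concatenated per time window and fed to a single classifier. Everything here is a stand-in, assuming fabricated feature names, synthetic data, and a random forest rather than the project's actual (and possibly deep) models:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Synthetic stand-in data: one row per time window, columns are features
    # from the different modalities (names are illustrative only):
    # [pupil_dilation_%, heart_rate_bpm, hrv_rmssd_ms, scl_microsiemens, gait_step_var]
    n = 400
    X = rng.normal(size=(n, 5))
    # Fabricated ground truth: pretend the first three features carry signal.
    y = (X[:, 0] + 0.5 * X[:, 1] - 0.5 * X[:, 2]
         + rng.normal(scale=0.5, size=n) > 0).astype(int)

    # Early (feature-level) fusion: train one classifier on the
    # concatenated features for a binary stressed/not-stressed label.
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")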
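
The difficulty adaptation can likewise be pictured as a small control rule. The toy staircase below hard-codes thresholds on a stress estimate and a task success rate purely for illustration; in the project this decision is made by learned models, and all knob names, thresholds, and limits are our assumptions:

    from dataclasses import dataclass

    @dataclass
    class SceneDifficulty:
        """Hypothetical knobs of the virtual living-room scene."""
        n_obstacles: int = 4
        obstacle_scale: float = 1.0   # relative obstacle size

    def adapt(scene: SceneDifficulty, stress: float, success_rate: float) -> SceneDifficulty:
        # Make the scene harder after relaxed, successful trials and
        # easier after stressed or unsuccessful ones.
        if stress < 0.4 and success_rate > 0.8:
            scene.n_obstacles = min(scene.n_obstacles + 1, 12)
            scene.obstacle_scale = min(scene.obstacle_scale * 1.1, 2.0)
        elif stress > 0.7 or success_rate < 0.5:
            scene.n_obstacles = max(scene.n_obstacles - 1, 1)
            scene.obstacle_scale = max(scene.obstacle_scale * 0.9, 0.5)
        return scene

    scene = SceneDifficulty()
    for stress, success in [(0.2, 0.9), (0.3, 0.85), (0.8, 0.4)]:
        scene = adapt(scene, stress, success)
        print(scene)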

Results

EyeQTrack's long-term aims go beyond the current project period and are directed at growing an ecosystem for adaptive XR in healthcare. To foster innovation, we not only pool the expertise of researchers in different fields but also reach out to stakeholders in healthcare and industry. For this reason, we seek collaboration with companies across the Austrian XR innovation landscape and are setting up an EyeQTrack Hub, a network of interest groups from industry, academia, and clinical practice. In addition, research results feed into the degree programmes in Digital Healthcare (Master's) and Nursing (Bachelor's) at the Department of Media and Digital Technologies and the Department of Health Sciences, respectively.

Publications

Vulpe-Grigorasi, A. (2023). Cognitive load assessment based on VR eye-tracking and biosensors. Proceedings of the 22nd International Conference on Mobile and Ubiquitous Multimedia, 589–591. https://doi.org/10.1145/3626705.3632618
Vulpe-Grigorasi, A. (2023). Multimodal machine learning for cognitive load based on eye tracking and biosensors. 2023 Symposium on Eye Tracking Research and Applications, 1–3. https://doi.org/10.1145/3588015.3589534
Leung, V., Hofbauer, S., Leonhartsberger, J., Kee, C., Liang, Y., & Schmied, R. (2022, May). Influence of education systems on children's visual behaviours as an environmental risk factor for myopia: A quantitative analysis with LIDAR-sensor tracking in classrooms. 18th International Myopia Conference, Rotterdam.

Want to know more? Feel free to ask!

Senior Researcher
Center for Digital Health and Social Innovation
Department of Health Sciences
Location: B - Campus-Platz 1
M: +43/676/847 228 670
External Staff
Stephanie Hirschbichler (St. Pölten University Hospital)
Benedikt Gollan (Research Studios Austria)
Michael Wagner (Medical University of Vienna)
Yuri Russo (University of Exeter, UK)
Michael Leitner (University of Salzburg)
Julia Kern (Soma Reality GmbH)
Funding
FFG (COIN Aufbau), No. FO999898083
Runtime
1 April 2023 – 31 March 2027
Status
current
Involved Institutes, Groups and Centers
Center for Digital Health and Social Innovation
Institute for Innovation Systems
Institute of Creative\Media/Technologies