Eye on HMI - Assessment of Human-Machine Interface with wearable eye-tracking glasses

DC I/O 2022 Slides by Maria Oskina, Zoltan Rusak and Peter Boom. https://doi.org/10.47330/DCIO.2022.GPQP2161


Abstract

More and more modern transport modalities are equipped with complex human-machine interfaces (HMIs). HMIs aim to narrow the information gap between complex automation systems and their human operators to ensure fast, effective interaction and decision making. We see HMIs in traffic controllers' rooms, in ADAS-equipped vehicles, in the cabins of public transport drivers, and in many other modern transport modes.

Designers create HMIs to effectively draw the operator’s attention to the most necessary and critical information and to facilitate accurate and fast decision making. Whether these systems adequately support human operators and achieve their designers' intentions is difficult to test objectively. [Hamilton and Grabowski 2013] showed that the visual, manual and cognitive distractions introduced by ADAS-equipped vehicles affect drivers, who in turn behave less safely on the road. There is, however, no comprehensive overview of the typical cognitive challenges operators face in different domains of HMI application, or of how these challenges can be objectively assessed.

We conducted a series of interviews on the difficulties operators experience with human-machine interfaces, speaking with human factors experts working in railway and ADAS systems, and investigated Endsley's theory of situation awareness in dynamic systems [Endsley 1995]. Our interviewees reported several typical issues from their HMI studies, including missed events on the HMI displays, information overload of operators, lack of contextual and situational awareness, and a resulting mismatch between expected and performed operator actions. We aim to develop an objective approach based on mobile eye-tracking technology that can be used to characterize operator situation awareness, decision making and task performance, and to validate HMI designs in specific mobility and industry applications.

The first step of our method is a HAZOP analysis of the human-machine events and operator tasks, which results in a set of use cases for the eye-tracking experiments. In the experiments, we use wearable eye-tracking glasses combined with AI-based computer vision algorithms. Wearable eye tracking enables us to conduct studies in real-world scenarios, while AI-based computer vision helps us automatically identify relevant events and streamline the eye-tracking data analysis workflow. With the glasses, we collect data for hotspot analysis and eye-movement sequence analysis, and measure the time to capture alarms, among other parameters. Finally, we use an AI component (including open AI models) in the glasses to mark events of interest and track when the operator's gaze interacts with an area or event of interest. We process the collected data to assess engagement with events, mistakes in responses, and missed information, and to explain their root causes.
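As a rough illustration of this kind of analysis (a sketch, not our actual pipeline), the Python example below matches gaze samples against an event of interest detected by a computer vision model and computes the time to first fixation, i.e. the "time to capture" an alarm. The GazeSample and AoiEvent structures, the first_fixation_latency function, and the 10-second miss threshold are all hypothetical; we assume gaze coordinates and detected bounding boxes are already mapped into the same scene-camera frame.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class GazeSample:
    t: float   # timestamp in seconds; samples assumed sorted by time
    x: float   # gaze x in scene-camera pixels
    y: float   # gaze y in scene-camera pixels

@dataclass
class AoiEvent:
    label: str                               # e.g. "alarm_popup", as labelled by the vision model
    t_onset: float                           # time the event appeared on the HMI
    bbox: Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max), scene-camera frame

def first_fixation_latency(gaze: List[GazeSample], event: AoiEvent,
                           max_latency: float = 10.0) -> Optional[float]:
    """Seconds from event onset until the gaze first lands inside the event's
    bounding box; None if the operator never looked at it within max_latency
    (a missed event)."""
    x0, y0, x1, y1 = event.bbox
    for s in gaze:
        if s.t < event.t_onset:
            continue
        if s.t > event.t_onset + max_latency:
            break
        if x0 <= s.x <= x1 and y0 <= s.y <= y1:
            return s.t - event.t_onset
    return None

A None return marks a missed event, which would feed the missed-information count and the subsequent root-cause analysis described above.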

We recently conducted a pilot study to validate the quality of the data collected with the openeye eye-tracking equipment (https://kexxu.com/). In the next step, we will validate our method in a full-size experiment. We are convinced that our insights will bring significant improvements to current research approaches in human factors studies of the comfort, safety and effectiveness of human-machine interaction. We also aim to apply our method in training and upskilling operators.
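The abstract does not detail the pilot protocol, so the following is only an assumed (if common) form of such a data-quality check: participants fixate known targets, and gaze accuracy and precision are reported in degrees of visual angle. A minimal sketch, with a hypothetical pixels-per-degree conversion factor:

import math
from typing import List, Tuple

def angular_error_deg(gaze_px: Tuple[float, float],
                      target_px: Tuple[float, float],
                      px_per_deg: float) -> float:
    """Angular offset (degrees) between a recorded gaze point and a known
    target, via a scene-camera pixels-per-degree factor (small-angle
    approximation)."""
    dx = gaze_px[0] - target_px[0]
    dy = gaze_px[1] - target_px[1]
    return math.hypot(dx, dy) / px_per_deg

def accuracy_and_precision(errors_deg: List[float]) -> Tuple[float, float]:
    """Accuracy = mean angular error; precision here is the standard deviation
    of the errors, one common proxy for sample dispersion."""
    mean = sum(errors_deg) / len(errors_deg)
    sd = math.sqrt(sum((e - mean) ** 2 for e in errors_deg) / len(errors_deg))
    return mean, sd

The pixels-per-degree factor depends on the scene-camera geometry and viewing distance, so it has to be calibrated per setup.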

Keywords

Human-machine interface, Situation awareness, Mobile eye-tracking.

Bibliography

  • ENDSLEY, M.R. (1995). Toward a Theory of Situation Awareness in Dynamic Systems. Human Factors, 37(1), 32–64.
  • HAMILTON, B. & GRABOWSKI, J.G. (2013). Cognitive Distraction: Something to Think About (Technical Report). Washington, D.C.: AAA Foundation for Traffic Safety.