Gaze-based keyboards offer a flexible means of human-computer interaction for both disabled and able-bodied users. Despite their convenience, they remain error-prone: eye-tracking devices may misinterpret the user's gaze, resulting in typesetting errors, especially when operated in fast mode. As a potential remedy, we present a novel error detection system that aggregates the decisions of two distinct subsystems, each operating on a different data stream.
The first subsystem operates on gaze-related measurements and exploits the eye-transition pattern to flag a typo. The second is a brain-computer interface that exploits a neural response, known as the Error-Related Potential (ErrP), which is naturally generated whenever the subject observes an erroneous action. Based on experimental data gathered from 10 participants in a spontaneous typesetting scenario, we first demonstrate that ErrP-based brain-computer interfaces can indeed be useful in the context of gaze-based typesetting, despite the putative contamination of EEG activity by eye-movement artefacts. We then show that the performance of this subsystem can be further improved by also taking into account the error detections of the gaze-related subsystem. Finally, the proposed bimodal error detection system is shown to significantly reduce the typesetting time in a gaze-based keyboard.
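The summary above does not spell out how the two subsystems' outputs are combined, so the sketch below only illustrates one common fusion approach: weighted averaging of the per-keystroke error probabilities produced by a gaze-based classifier and an ErrP-based classifier. The function name, weights, and threshold are hypothetical placeholders, not the method from the article.

```python
import numpy as np

def fuse_error_decisions(p_gaze, p_errp, w_gaze=0.5, w_errp=0.5, threshold=0.5):
    """Combine per-keystroke error probabilities from a gaze-based and an
    ErrP-based subsystem into a single binary "typo / no typo" decision.

    p_gaze, p_errp : sequences of probabilities in [0, 1], one per keystroke.
    w_gaze, w_errp : fusion weights (hypothetical; a real system would tune
                     them, e.g. from each classifier's validation accuracy).
    """
    p_gaze = np.asarray(p_gaze, dtype=float)
    p_errp = np.asarray(p_errp, dtype=float)
    # Weighted average of the two subsystems' scores.
    fused = (w_gaze * p_gaze + w_errp * p_errp) / (w_gaze + w_errp)
    return fused >= threshold  # True -> flag the keystroke as a typo

# Toy usage: three keystrokes scored by both subsystems.
gaze_scores = [0.2, 0.7, 0.9]   # from an eye-transition pattern classifier
errp_scores = [0.1, 0.8, 0.6]   # from a single-trial ErrP classifier
print(fuse_error_decisions(gaze_scores, errp_scores))  # [False  True  True]
```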
Read the full article here!