Overcoming the "Midas Touch" with Passive BCIs

While incredible innovation is happening in the VR/XR space, most headsets on the market today still rely on traditional input methods such as buttons and motion controllers. With growing interest in touchless interaction, new selection methods are increasingly being explored, including eye-gaze, blink, and gesture-based selection.

Amongst these, eye-gaze selection stands out as the most promising approach to touchless user interaction. Blink-based selection feels unnatural and often triggers unintended actions because of natural blinking, while pinch gestures fall short when both hands are occupied.

The “Midas touch” problem in eye-gaze selection

Eye-gaze selection works by tracking the user's eye movements and relying on fixation dwell time for target selection. Fixation dwell time is the length of time the user's gaze remains fixed on a particular interface element. When it exceeds a predetermined fixed threshold, it is taken to reflect the user's intent to interact with that element, which is then selected.
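To make the mechanism concrete, here is a minimal sketch of the dwell-time loop in Python. The `get_gaze_target` and `select` callables and the 0.8 s threshold are hypothetical placeholders, not any particular eye tracker's API: the timer restarts whenever the gaze moves, and a selection fires once a fixation outlasts the fixed threshold.

```python
import time

DWELL_THRESHOLD_S = 0.8   # fixed dwell-time threshold shared by every UI element

def run_dwell_selection(get_gaze_target, select):
    """Select whichever element the gaze rests on for longer than the threshold."""
    current_target = None
    fixation_start = None
    while True:
        target = get_gaze_target()        # element under the user's gaze, or None
        now = time.monotonic()
        if target != current_target:      # gaze moved: restart the dwell timer
            current_target, fixation_start = target, now
        elif target is not None and now - fixation_start >= DWELL_THRESHOLD_S:
            select(target)                # dwell exceeded the threshold -> select
            current_target, fixation_start = None, None
        time.sleep(0.01)                  # poll at roughly 100 Hz
```

Note that the threshold is the same for every element, which is exactly where the trouble starts.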

However, even gaze-based selection presents challenges. Because a single dwell-time threshold is applied to every element in the interface, stimulus complexity is not taken into account. Stimulus complexity reflects how intricate an interface element is; more intricate elements usually require more cognitive resources, and therefore more time, to evaluate.

This means that dwell-time thresholds can be too short for some elements, particularly complex ones: by the time the threshold elapses, the user may not have finished deciding whether to select the object, potentially leading to accidental selections. For the same reason, it can be hard for users to find a place to rest their gaze without triggering a selection, especially in cluttered interfaces.

This is known as the "Midas touch" problem: the challenge of distinguishing mere fixation from intentional selection. It underscores the need for interface designs that can accurately discern user intent without inadvertently selecting unwanted elements.

Embracing touchless interaction

This is where passive brain-computer Interfaces (pBCIs) come into play, offering a transformative solution to this problem. Passive BCIs have the remarkable ability to detect intention to interact implicitly, without requiring explicit commands. By revealing through eye tracking where the user’s attention and gaze are directed and then using the passive BCI system to initiate the selection process once the user intends to do so, the Midas problem is effectively addressed.
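A minimal sketch of that fusion is shown below, assuming a hypothetical `intent_probability()` call that returns the passive BCI classifier's confidence that the user wants to act; the 0.7 threshold and the short confirmation fixation are likewise illustrative, not a specific product API. Gaze tells the system where the user is looking, while the pBCI signal decides whether to act.

```python
import time

INTENT_THRESHOLD = 0.7   # assumed confidence required from the pBCI intent classifier
MIN_FIXATION_S = 0.3     # short fixation just to establish what is being looked at

def run_gaze_bci_selection(get_gaze_target, intent_probability, select):
    """Select an element only when fixation and implicitly detected intent coincide."""
    current_target, fixation_start = None, None
    while True:
        target = get_gaze_target()
        now = time.monotonic()
        if target != current_target:          # gaze moved: restart the fixation timer
            current_target, fixation_start = target, now
        elif (target is not None
              and now - fixation_start >= MIN_FIXATION_S
              and intent_probability() >= INTENT_THRESHOLD):
            # Gaze provides the *where*; the passive BCI provides the *whether*.
            select(target)
            current_target, fixation_start = None, None
        time.sleep(0.01)
```

Because no fixed dwell deadline triggers the selection on its own, the user can rest their gaze anywhere without fear of accidental activation.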

The synergy between eye tracking and passive BCIs extends beyond the Midas touch problem, enabling correlations between contextual cues and changes in the user's mental state. This allows for greater context awareness as well as neuroadaptive implementations that dynamically adjust content for a truly adaptive user experience.
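As a rough illustration of what such a neuroadaptive loop might look like, the sketch below assumes an entirely hypothetical `estimate_workload()` call and UI methods; the idea is simply that a mental-state estimate from the passive BCI drives content adjustments.

```python
def adapt_interface(ui, estimate_workload, high=0.7, low=0.3):
    """Adjust the interface based on a mental-state estimate from the passive BCI."""
    workload = estimate_workload()   # e.g. a 0..1 workload index derived from EEG features
    if workload > high:
        ui.reduce_clutter()          # simplify the scene when the user seems overloaded
    elif workload < low:
        ui.show_more_detail()        # surface more content when spare capacity is available
```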

Passive BCIs thus emerge as the missing component for touchless interactions, resolving the challenges posed by the Midas touch problem. Through the fusion of eye tracking and implicit intention detection, interfaces can evolve to become more intuitive, responsive, and tailored to individual users' needs, ushering in a new era of immersive and user-centric experiences.

