As more VR headsets equipped with advanced sensors, like Apple's Vision Pro, enter the market, we are witnessing the first head-mounted displays (HMDs) that support neurophysiological recording and brain-computer interfaces (BCIs), such as OpenBCI's Galea headset.
This integration of XR with BCI technology is achieved by attaching EEG sensors directly to VR headsets. While this fusion is still in its early stages, preliminary studies indicate that BCI performance is maintained in both virtual and augmented reality scenarios. Another approach is modular sensors that can be integrated into headphones or worn around the ear.
Virtual stimuli and environments are highly adaptable, allowing for fine-tuning of nearly every aspect. Information can be superimposed on real life and even physically impossible scenarios can be presented. This informational richness and malleability of virtual worlds render them an ideal platform for BCI applications.
By granting spatial computing systems access to mental states, passive BCIs enable implicit control of such environments, allowing them to also become neuroadaptive. Imagine virtual characters that truly understand you and user interfaces or even entire worlds dynamically adapting to your mental or emotional state to allow a seamless user experience in ways we can only dream of.
This synergy unlocks new forms of entertainment, art and gaming, offering immersive, personalized experiences and innovative tools for cognitive training, emotion regulation and information management systems.
In fact, we have already taken the first steps toward such experiences by combining EEG-based focus measurements with the popular fantasy game Skyrim VR in our Real Virtual Magic modification of the game. Here, your magical abilities depend on your actual mental focus. This lets you experience magic as though it were real 😯 Our software, the Intuitive XR suite, is also available as an integration into the Unity 3D game engine.
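As a minimal sketch of how such a focus-driven mechanic could work (this is illustrative and not the actual Intuitive XR or Real Virtual Magic implementation): compute a simple focus index from EEG band power, for example a beta/theta ratio, a common engagement proxy, and scale an in-game spell's strength by it. All function names and parameters here are assumptions for illustration.

```python
import numpy as np

def band_power(eeg: np.ndarray, fs: float, low: float, high: float) -> float:
    """Mean power of a single EEG channel in a frequency band, via the FFT."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2 / len(eeg)
    mask = (freqs >= low) & (freqs < high)
    return float(psd[mask].mean())

def focus_index(eeg: np.ndarray, fs: float) -> float:
    """Beta/theta power ratio: a simplified, commonly used engagement proxy."""
    beta = band_power(eeg, fs, 13.0, 30.0)
    theta = band_power(eeg, fs, 4.0, 8.0)
    return beta / (theta + 1e-12)  # avoid division by zero

def spell_power(focus: float, base: float = 10.0,
                gain: float = 5.0, cap: float = 100.0) -> float:
    """Map the focus index onto a capped spell strength (illustrative mapping)."""
    return min(base + gain * focus, cap)
```

In a real pipeline the EEG would first be filtered and artifact-corrected, and the mapping calibrated per user; the point here is only the shape of the loop from brain signal to game parameter.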
Passive BCIs can also revolutionize the way we interact with virtual user interfaces. Current VR/XR interfaces employ various selection mechanisms, including gaze-based interactions, blink-based selection and pinch gestures. These mechanisms, while effective, are often impractical or unnatural, or struggle to differentiate between intentional and unintentional actions, leading to user frustration.
However, through the use of passive BCIs, the intention to interact with virtual elements can be decoded and, when supplemented by eye-tracking data revealing where attention and gaze are directed, robust intuitive touchless interactions become achievable.
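One way such a fusion could be sketched: trigger a selection only when a decoded interaction intention coincides with a gaze fixation on the same element. The data structure, thresholds and the intention probability below are placeholders for whatever the eye tracker and passive-BCI classifier actually provide.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeSample:
    element_id: Optional[str]  # UI element currently fixated, if any
    fixation_ms: float         # duration of the current fixation

def select_target(gaze: GazeSample,
                  p_intent: float,
                  p_threshold: float = 0.8,
                  min_fixation_ms: float = 300.0) -> Optional[str]:
    """Fuse eye tracking and decoded intention: select only when both agree.

    p_intent is the (assumed) probability, from a passive-BCI classifier,
    that the user currently intends to interact.
    """
    if gaze.element_id is None:
        return None                       # not looking at any element
    if gaze.fixation_ms < min_fixation_ms:
        return None                       # fixation too brief to count
    if p_intent < p_threshold:
        return None                       # no decoded intention to interact
    return gaze.element_id
```

The design choice is the conjunction: gaze alone fires on every glance (the classic "Midas touch" problem), while the decoded intention alone does not say *what* to select; requiring both yields a touchless selection that is harder to trigger by accident.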
In fact, by correlating such data we can map virtual elements directly to the corresponding mental states to which they contribute, elucidating causal relationships. This is especially informative for neuroergonomics, for training purposes, and when designing new virtual worlds and interfaces.
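As an illustrative sketch of this kind of mapping (the data layout and the mental-state index are assumptions, not a described pipeline): correlate each interface element's gaze-exposure time series with a concurrently recorded mental-state measure, such as a workload index, to flag elements associated with elevated load.

```python
import numpy as np

def element_state_correlations(exposure: "dict[str, np.ndarray]",
                               state: np.ndarray) -> "dict[str, float]":
    """Pearson correlation between each element's gaze exposure (1 while
    fixated, 0 otherwise, per time window) and a mental-state index sampled
    over the same windows. Elements with high positive correlation are
    candidates for ergonomic redesign."""
    return {eid: float(np.corrcoef(series, state)[0, 1])
            for eid, series in exposure.items()}
```

A real analysis would need far more care (lagged responses, confounds, multiple-comparison correction), but the core idea of linking virtual elements to mental states is just this correlation.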
In summary, the merger of XR with BCI technology forms a powerful synergy, with passive BCIs providing the means for implicit control and neuroadaptivity, while XR offers versatile virtual spaces for implementation. Enhanced by sensors like eye tracking, XR-BCI integration fosters new immersive and intuitive interactions and addresses shortcomings in existing virtual interaction tools.