World’s First Brain-Computer VR Interface
Allows Users To Control VR With Their Minds
Neurable has built the first brain-computer interface that lets VR users employ their thoughts as a controller in virtual environments.
The prototype attaches to the back of an HTC Vive headset and reads the user's brain signals. The thoughts must correspond to specific actions within the virtual activity being undertaken; the interface interprets these signals and triggers the matching actions.
In a blog post, Neurable vice president Michael Thompson explains: “This tech is already capable of typing on virtual keyboards and controlling prosthetic limbs, entirely from brain activity. Such intent-driven interactions hold tremendous promise for mixed reality environments, where current problems with user interaction constitute a significant barrier to more widespread adoption.”
Awakening Debut
The company debuted its technology at SIGGRAPH 2017, alongside a preview of its VR game Awakening.
The game puts the player in the shoes of a child endowed with telekinetic powers who must escape the clutches of evil robots. Players manipulate objects to defeat their opponents, with no physical controller at all.
So how does this work?
The interface uses machine learning to interpret brain activity in real time. A user can look at a virtual object in VR, think the word ‘grab’, and the object will move towards them.
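Neurable has not published its code, so the following is purely a hypothetical sketch of the general pattern such systems follow: short windows of EEG samples are reduced to feature vectors and passed to a trained classifier, whose predicted intent is mapped to a game action. All names, the features, and the simulated signals below are invented for illustration; a production pipeline would be far more sophisticated.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical sketch: classify short windows of EEG samples into
# user intents, then map each intent to an in-game action. The data
# here is simulated; a real system would stream from EEG electrodes.

rng = np.random.default_rng(0)
N_CHANNELS, WINDOW = 8, 64          # 8 electrodes, 64 samples per window
INTENTS = ["idle", "grab"]

def features(window: np.ndarray) -> np.ndarray:
    """Crude per-channel amplitude features (mean absolute value)."""
    return np.abs(window).mean(axis=1)

# Simulate a labelled calibration session: 'grab' windows carry more power.
X, y = [], []
for label in (0, 1):
    for _ in range(200):
        window = rng.normal(scale=1.0 + label, size=(N_CHANNELS, WINDOW))
        X.append(features(window))
        y.append(label)

clf = LogisticRegression().fit(np.array(X), np.array(y))

def on_window(window: np.ndarray) -> None:
    """Called for each new window of streamed EEG; triggers the action."""
    intent = INTENTS[int(clf.predict([features(window)])[0])]
    if intent == "grab":
        print("pull gazed-at object toward the player")  # game-engine hook

on_window(rng.normal(scale=2.0, size=(N_CHANNELS, WINDOW)))  # likely 'grab'
```

The key point the sketch captures is that the classifier only produces an intent label; what that label does is entirely up to the game.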
Neurable CEO and president Ramses Alcaide said:
“We have two modes. Pure EEG mode, which just determines the object you want and brings it to you directly, and we have a mode that is a hybrid BCI [brain-computer interface] mode, and in that mode we can use the eyes as a type of mouse where you can move your eyes near…the object you want to select. From there your brain tells us which one you clicked on.”
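Hypothetically, the two modes Alcaide describes could share the same selection logic and differ only in how the candidate object is chosen: pure EEG mode decodes the target directly from decoder confidence, while hybrid mode uses gaze as a coarse pointer and treats the brain signal only as the ‘click’. The sketch below, with invented names and thresholds, illustrates that split:

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    x: float
    y: float

def gaze_candidate(objects, gaze_x, gaze_y):
    """Hybrid mode: the eyes act as a mouse, picking the nearest object."""
    return min(objects, key=lambda o: (o.x - gaze_x)**2 + (o.y - gaze_y)**2)

def select(objects, gaze=None, eeg_confidence_by_name=None):
    """Pure EEG mode decodes the target directly; hybrid mode narrows
    the choice with gaze and uses the brain signal only to 'click'."""
    eeg_confidence_by_name = eeg_confidence_by_name or {}
    if gaze is not None:
        candidate = gaze_candidate(objects, *gaze)
        # e.g. an evoked-response detector confirming intent to select
        if eeg_confidence_by_name.get(candidate.name, 0.0) > 0.8:
            return candidate
        return None
    # Pure EEG mode: whichever object the decoder is most confident about.
    return max(objects, key=lambda o: eeg_confidence_by_name.get(o.name, 0.0))

scene = [SceneObject("key", 0.1, 0.2), SceneObject("crate", 0.8, 0.7)]
print(select(scene, gaze=(0.15, 0.25),
             eeg_confidence_by_name={"key": 0.93}))           # hybrid: key
print(select(scene, eeg_confidence_by_name={"crate": 0.6}))   # pure EEG: crate
```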
Neurable is targeting VR arcades in 2018 with the full release of Awakening. The company has so far raised over $2 million and has big plans for its tech, as Alcaide elaborates:
“I think the future of mixed reality interactions is an ecosystem of solutions that incorporates voice, gesture control, eye tracking and the missing link to that entire puzzle which is brain-computer interfaces…we need some sort of system that prevents the action from happening until the user wants it to happen, and that’s where brain-computer interfaces come in.”
“In my opinion mixed reality cannot become a ubiquitous computing platform like the iPhone, or like the computer, until we have brain-computer interfaces as part of the solution.”