In a candid interview with New Zealand's 1 News, Valve co-founder Gabe Newell sat down to talk all about his vision for the future of brain-computer interfaces (BCI), and how the technology is set to change everything about how we live (and play) today. From the outside it may seem as if Valve is taking baby steps, but Newell says the research is moving much faster than expected.
Newell has not been secretive about his thoughts on BCI, which he has called an "extinction-level event for every form of entertainment." His message to software developers: start thinking now about how to use BCI.
How soon? Newell told 1 News that by 2022 studios should have them in their test labs, "simply because there is too much useful data."
Newell talks about BCI through a decidedly consumer-focused lens – understandable, coming from the leading mind behind Steam, the largest digital distribution platform for PC games, not to mention a pioneer of consumer VR as we know it today.
For Newell, BCI will one day allow developers to create experiences that completely bypass our traditional "meat peripherals" – eyes, ears, arms and legs – giving users access to richer experiences than today's reality can provide.
"You're used to experiencing the world through your eyes, but eyes were created by this low-cost bidder that didn't care about failure rates and RMAs, and if it got broken there was no way to repair anything effectively – which totally makes sense from an evolutionary perspective, but doesn't reflect consumer preferences at all. So the visual experience, the visual fidelity we'll be able to create – the real world will stop being the metric we apply to the best possible visual fidelity."
On the road to the more immersive, highly adaptive future, Newell revealed that Valve is taking some important first steps, namely the recently unveiled partnership with OpenBCI, the neurotechnology company behind a fleet of open source, non-invasive BCI devices.
Newell says the partnership is working toward a way for "everyone to have high-resolution [brain signal] read technologies built into headsets, in a variety of modalities."
Back in November, OpenBCI announced that it was creating a BCI specifically for VR/AR headsets, called Galea, which sounded very similar to what Valve's Principal Experimental Psychologist, Dr. Mike Ambinder, described in his GDC 2019 talk envisioning VR headsets equipped with electroencephalogram (EEG) sensors.
Although Newell does not go into detail about the partnership, he says that BCIs are set to play a fundamental role in game design in the near future.
"If you're a software developer in 2022 and don't have one of these in your test lab, you're making a stupid mistake," Newell told 1 News. "Software developers of interactive experiences – you'll definitely be using one of these modified VR head straps routinely – simply because there is too much useful data."
There's a real laundry list of things BCI could do in the future by giving software developers access to the brain and letting them "edit" the human experience. Newell has been talking about this for a long time; outside of the hypotheticals, he says near-term research in the field is so fast-paced that he is reluctant to commercialize anything for fear of slowing it down.
"The speed at which we're learning things is so fast that you don't want to say too soon, 'OK, let's just lock it all down and build a product and go through all the approval processes,' when six months from now I'll want something that would have enabled a number of other features."
It is not certain whether Galea is the subject of the partnership, but its reported capabilities appear to be in line with what Newell says is coming down the road. Galea is reportedly packed with sensors, including not only EEG but also sensors capable of electrooculography (EOG), electromyography (EMG), electrodermal activity (EDA) and photoplethysmography (PPG).
OpenBCI says that Galea gives researchers and developers a way to measure "human emotions and facial expressions," including happiness, anxiety, depression, attention span and interest level – many of the data points that could inform game developers on how to create better, more immersive games.
Provided such a high-tech VR head strap can reliably "read" emotional states, it would represent a major step in a new direction for gaming. And it's one that Valve clearly intends to exploit as it continues to create (and sell) the most immersive gaming experiences possible.
Interested in watching the full interview? Catch the video directly on the 1 News website.