If you’re reading this article, I’m sure you’ve heard of virtual reality. It’s this emerging technology where computer-generated environments can be simulated in real-time. Usually, a VR headset is fitted over your eyes to visually immerse you in virtual reality. There are endless possibilities with virtual meetings, trips, and fully immersive entertainment.
Virtual reality has always been at the forefront of emerging technology. Everyone wants to know: how immersed can you be in a movie or game? What’s the next step to immersing people in a truly virtual reality without distractions from their actual reality? How can you play games with your thoughts without sounding crazy to other people? This is where Brain-Computer Interfaces come in.
No… that would lead to some really shit UX. A Brain-Computer Interface (BCI), in simple terms, is hardware that interprets brain signals and transmits them to a computer. BCIs enable communication between the brain and various machines. Before we talk about what we can do at the VR-BCI intersection, we have to learn more about BCIs. First off, there are 3 types of BCIs:
Not yet. Invasive BCIs are still fairly controversial, especially since people don’t really trust hardware that goes into their heads. I’m not going to lie, I wouldn’t either, not even if Neuralink paid me to.
Invasive BCIs work by placing individual electrodes directly in the cortex to measure small groups of neurons.
The base signal from the neurons is taken directly from the cortex and transmitted to a computer without any “noise” or interference. This is the best way to get signals from the brain, as it’s the closest a BCI can be to the brain. These BCIs could also stimulate your brain with neurofeedback (sending electrical signals to your brain) to possibly do things like enhance or restore vision, and even upload information to your brain.
It’s literally inside your brain, and while it’s great for that reason, it’s also shady for the same reason. There are a lot of ethical and security concerns surrounding BCIs in general (that I won’t cover in this article), especially invasive BCIs. Imagine if someone were to hack into your BCI and fill your brain with ads, or steal that data from your brain. There’s a lot of potential for invasive BCIs, but there are also a lot of downsides, and they aren’t exactly ready for VR yet.
Semi-invasive BCIs are less extreme, but you still need surgery (a craniotomy) to use Electrocorticography (ECoG). Usually, ECoG is only used for medical purposes, but the clinical risk is low. ECoG works by placing electrodes on the surface of the brain, either on the dura or the arachnoid. One of the great features of ECoG is its high spatial resolution.
Think of spatial resolution as the clarity of a picture: higher spatial resolution means there are more pixels per inch and the quality is higher. Even better, ECoG isn’t affected by “background noise” like eye movement or muscle movement. Often, the ECoG setup can also apply neurofeedback, which is essentially stimulating your cortex.
The potential for this hardware is massive, especially for the possibility of merging BCI with VR. In this study, patients were able to move a cursor in 2 dimensions with just 30 minutes of training, which is insane to think about. Especially in the context of VR, imagine how easy it’s going to be to control a virtual character with your brain in 5 years.
Even though I’ve been researching this technology, and I know it’s fairly safe, I’m still not sure how comfortable I am opening up my head. That’s why we use non-invasive methods like Electroencephalography (EEG) for most consumer BCIs. There are other non-invasive methods, but EEG is the most popular since it’s smaller and cheaper. A consumer EEG usually takes the form of a headset like Muse or Emotiv. An EEG uses electrodes placed on your scalp to pick up electrical activity all around your brain. What’s recorded is the voltage difference between at least 2 different electrodes.
Sadly, EEGs have their fair share of problems too. Because of their external electrode placement, there’s often a lot of interference in the signal that has to be sorted out and cleaned up. EEGs also have low spatial resolution compared to more invasive methods. That’s what you’d expect when signals have to travel through your skull and skin to get picked up by the EEG.
But it’s not all bad; the tradeoff is that EEGs have really high temporal resolution. Temporal resolution is defined as “how closely the measured activity corresponds to the timing of the actual neuronal activity.” In other words, it’s the BCI’s latency: when your brain does something, how fast can the BCI react? Higher temporal resolution means low latency (that’s good), and lower temporal resolution means the opposite (not good).
This means that EEGs are perfect for VR! EEGs are extremely convenient since, for some of them, you can just put the headset on and go. In practice, EEGs pair really well with VR headsets since you can put electrodes onto the part of the headset that touches your scalp and use that to read your brainwaves. This is why EEGs are pretty much the only BCIs being used with VR today.
I’m glad you asked, mystery reader! Clearly, if you combine BCI technology with VR, you’ve got a match made in technology heaven. Let’s dive into the design of a VR x BCI game built on NeuroSky’s technology called “Throw Trucks With Your Mind.” The title basically explains it all: it’s a game about throwing trucks with your mind.
The first step to creating a game like this is preprocessing the user’s data and extracting the key signals. EEG data is messy, and usually, there’s interference from 4 sources:
- The EEG equipment itself
- Electrical interference not from the EEG or subject (power lines, laptops)
- Faulty leads and electrodes
- Electrical activity from the heart, eye blinking, eyeball movements, muscle movements in general.
Although certain activities from your heart, eyes, or muscles can be important to keep track of, they can be annoying. For example, if you are creating a passive BCI algorithm (an algorithm that monitors involuntary/unintentional brainwaves) that monitors calmness, why the hell should you care about your eyeball movements?
Even though I’m sure that somehow, someone could find a correlation given enough data, you probably wouldn’t care. This is why we preprocess EEG data by applying several filters, algorithms, and techniques to make sure that we are extracting the required data.
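To make that concrete, here’s a minimal sketch of one common preprocessing step: a band-pass filter that keeps only a chosen frequency range and throws away everything else (power-line hum, slow drift from blinks, and so on). This naive O(n²) DFT version is purely illustrative; real EEG pipelines use proper filter designs and artifact-removal techniques like ICA.

```python
import math

def bandpass(samples, fs, low, high):
    """Naive DFT band-pass filter: keep only frequency components
    between `low` and `high` Hz. Illustrative only -- real EEG
    pipelines use far more efficient and robust filters."""
    n = len(samples)
    # Forward DFT (O(n^2), fine for a short demo window)
    re = [0.0] * n
    im = [0.0] * n
    for k in range(n):
        for t, x in enumerate(samples):
            angle = -2 * math.pi * k * t / n
            re[k] += x * math.cos(angle)
            im[k] += x * math.sin(angle)
    # Zero every bin outside the pass band (and its mirror image)
    for k in range(n):
        freq = min(k, n - k) * fs / n
        if not (low <= freq <= high):
            re[k] = im[k] = 0.0
    # Inverse DFT, real part only
    out = []
    for t in range(n):
        s = 0.0
        for k in range(n):
            angle = 2 * math.pi * k * t / n
            s += re[k] * math.cos(angle) - im[k] * math.sin(angle)
        out.append(s / n)
    return out
```

For example, feeding in one second of a 10 Hz “alpha-like” sine contaminated with 60 Hz mains hum and band-passing 8–12 Hz returns just the 10 Hz component.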
In Throw Trucks With Your Mind (referred to as “the game”), NeuroSky is trying to extract your excitement levels, as well as the focus and mental effort you’re putting in. Their algorithms are fairly complex, but essentially they’re analyzing your neural oscillations.
Woah, that sounds like fancy neuroscience, right? Not really, they’re just your brainwaves, classified. Neural oscillations are divided based on their speed (measured in Hertz), and there are 5 of them.
- Delta (0 to 4 Hz). Delta waves are usually associated with deep stages of sleep and meditation. Fun fact: delta has the highest amplitude and the slowest rate of all the waves.
- Theta (4 to 7 Hz). Theta waves are low-frequency and low-amplitude waves that are emitted during sleep, daydreaming, and meditation.
- Alpha (8 to 12 Hz). Alpha waves are generated by the occipital lobe when closing the eyes or relaxing. They’re most visible over the parietal and occipital lobes.
- Beta (12 to 30 Hz). Beta waves are what your brain usually emits during your waking hours. These waves become smaller and faster when performing hard mental work, such as problem-solving and decision-making. They’re most visible in the frontal cortex during intense and focused mental activity.
- Gamma (30 to 100+ Hz). Gamma is the fastest brain wave and occurs when a person faces a sudden life-or-death situation or something similar. Basically, when shit hits the fan, you start creating Gamma waves.
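Once you know a signal’s dominant frequency, mapping it to a band is a simple lookup. A quick sketch (note that sources disagree slightly at the edges, so this folds the 7–8 Hz gap into theta; treat the cutoffs as approximate):

```python
def classify_band(freq_hz):
    """Map a dominant oscillation frequency (Hz) to its conventional
    band name. Boundaries follow the ranges listed above; the 7-8 Hz
    gap between theta and alpha is folded into theta here."""
    if freq_hz < 4:
        return "delta"   # deep sleep, meditation
    elif freq_hz < 8:
        return "theta"   # sleep, daydreaming
    elif freq_hz < 12:
        return "alpha"   # relaxed, eyes closed
    elif freq_hz < 30:
        return "beta"    # waking hours, focused mental work
    else:
        return "gamma"   # sudden, high-intensity situations
```

So a relaxed player idling in the lobby would mostly register alpha, while someone straining to lift a truck should push into beta.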
NeuroSky passively monitors these neural oscillations and tracks:
- What you love and hate in the VR world
- Pleasant or unpleasant feelings, as well as the intensity of your response.
- How well you are learning while interacting with the virtual world
Honestly, I don’t really know if that scares me or intrigues me. The opportunities are truly endless, especially with the amount of data that you have access to from someone’s brain. You can get user feedback from these passive algorithms to either improve development or change the game based on your mental state.
I’m serious, it’s possible to create games that can change based on your mental state. Of course, it’s hard, and it isn’t really being done here, but it’s fun to think about the possibilities.
If you didn’t watch the Care Bears, do it. It’s pretty weird, it’s about these bears that walk around caring. Whenever these bears would encounter a problem, they would gather together and stare menacingly at the problem. They would stare so intently, and focus their collective energies on this problem with all their care. So much so, that love shot out of their chest… I am dead serious. Here is a video of them doing it, and here is another video of Dave Chappelle explaining it in case you don’t believe me.
What does this have to do with the game? Well, this is the next step in our development: active BCI algorithms. For this game, you have to push and pull objects simply with your mental focus. You are throwing trucks with your mind, after all, so some deep focus may be useful. Maybe take some tips from the Care Bears and stare at your problem till it moves; it might work.
Active BCI algorithms work based on intentional changes in the subject’s brainwaves. For example, when you concentrate on throwing a truck, your Beta waves become smaller and faster. The active BCI algorithm detects this change and translates it into the intended action. In other words, when you do the Care Bear Stare, the changes in your brainwaves are recorded, classified, and then used to control your character.
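A toy version of that idea: track a rolling baseline of beta-band power, and fire an “intent” event whenever the current reading spikes well above it. Every name here and the 1.5x threshold are made up for illustration; NeuroSky’s actual classifiers are proprietary and far more sophisticated.

```python
from collections import deque

class FocusTrigger:
    """Toy active-BCI trigger: fires when beta-band power spikes
    above a rolling baseline. Hypothetical sketch -- window size and
    spike ratio are arbitrary illustrative values."""

    def __init__(self, window=8, ratio=1.5):
        self.baseline = deque(maxlen=window)  # recent beta-power readings
        self.ratio = ratio                    # spike factor that counts as intent

    def update(self, beta_power):
        """Feed one beta-power sample; return True on a detected focus spike."""
        fired = False
        if len(self.baseline) == self.baseline.maxlen:
            avg = sum(self.baseline) / len(self.baseline)
            fired = beta_power > self.ratio * avg
        self.baseline.append(beta_power)
        return fired
```

In use, the game would call `update()` every frame with the latest beta-power estimate and launch the truck whenever it returns `True`. Normalizing against the player’s own baseline matters because absolute power varies wildly from head to head.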
Uh… not yet? We are really far from completely leaving our consciousness behind and joining a virtual reality. BCI and VR technologies are still emerging right now, and it’s fascinating to see what these developers create. Although we are far from fully controlling a game with your brain and having complete immersion, it’s in sight. You just need really good vision to see it, especially if you’re looking for Aincrad.