AUSTIN (KXAN) — Researchers at the University of Texas at Austin have developed a way for a virtual reality, or VR, headset to monitor brain activity.
The research team created a noninvasive electroencephalogram (EEG) sensor and installed it in a modified Meta VR headset that can be worn comfortably for long periods. The sensor measures the brain’s electrical activity during immersive VR interactions and can examine how people react to hints, stressors and other outside forces, according to a press release from UT.

The release said the device can be used in many ways, including helping people with anxiety, measuring attention or mental stress, and even giving a human the ability to see through the “eyes of a robot.”
“Virtual reality is so much more immersive than just doing something on a big screen,” said Nanshu Lu, a professor in the Cockrell School of Engineering’s Department of Aerospace Engineering and Engineering Mechanics who led the research. “It gives the user a more realistic experience, and our technology enables us to get better measurements of how the brain is reacting to that environment.”
The research is published in the journal Soft Science.
Though the pairing of VR and EEG sensors is not new, the UT researchers say existing commercial options are costly, while their version is more comfortable for the user, which could extend wearing time and open up additional applications.
“All of these mainstream options have significant flaws that we tried to overcome with our system,” said Hongbian Li, a research associate in Lu’s lab.
The UT team said its version overcomes these issues by using technology similar to the electronic tattoos that researchers at UT and Texas A&M developed earlier this year.
The technology will also play into another major research project at UT Austin: a new robot delivery network that will serve as the largest study to date on human-robot interactions, according to the release.
Lu is part of that project, and the VR headsets will be used by people either traveling with robots or watching from a remote “observatory.” The release said those observers will be able to see from the robot’s perspective, and the device will also measure the mental load of observing for long periods.
“If you can see through the eyes of the robot, it paints a clearer picture of how people are reacting to it and lets operators monitor their safety in case of potential accidents,” said Luis Sentis, a professor in the Department of Aerospace Engineering and Engineering Mechanics who is co-leading the robot delivery project and is a co-author on the VR EEG paper.
The researchers have filed preliminary patent paperwork for the EEG device, and they’re open to partnering with VR companies to create a built-in version of the technology.