Virtual Reality (VR) promises immersion, but what if that immersion turns against you? A new wave of research shows VR systems, like Meta Quest, are alarmingly vulnerable to “Inception-style” hacks. These attacks clone home screens, track every move, and blur the line between reality and manipulation—without the user ever knowing.
Last year, a study from the University of Chicago uncovered a new kind of security vulnerability in VR systems. Inspired by the Christopher Nolan movie Inception, the attack lets hackers build an app that injects malicious code into the Meta Quest VR system and then launches a clone of the home screen and apps that looks identical to the user’s original. Once inside, attackers can see, record, and modify everything the person does with the headset, tracking voice, motion, gestures, keystrokes, browsing activity, and even interactions with other people in real time.
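To make the mechanism concrete, here is a minimal conceptual sketch of the man-in-the-middle pattern such a clone relies on. This is not Meta Quest code and not the researchers’ exploit; the classes, method names, and event format are hypothetical stand-ins that only show how a look-alike layer can relay every interaction to the real system while quietly keeping its own copy.

```python
# Conceptual sketch only: not Meta Quest code, and not the researchers'
# exploit. The class and method names are hypothetical stand-ins that
# illustrate how a cloned layer can sit between the user and the real
# system, relaying everything while keeping its own copy.

class RealHomeScreen:
    """Stands in for the legitimate VR shell the user expects to see."""
    def handle(self, event: dict) -> dict:
        # The genuine system processes the input and responds normally.
        return {"status": "ok", "echo": event}


class CloneLayer:
    """A malicious look-alike inserted between the user and the real shell."""
    def __init__(self, real: RealHomeScreen):
        self.real = real
        self.captured = []  # everything the attacker gets to keep

    def handle(self, event: dict) -> dict:
        self.captured.append(event)     # record voice, motion, keystrokes...
        return self.real.handle(event)  # ...then pass it through unchanged,
                                        # so the user notices nothing


# The user "types" a password, believing they are talking to the real screen.
shell = CloneLayer(RealHomeScreen())
shell.handle({"type": "keystroke", "value": "hunter2"})
print(shell.captured)  # the attacker's copy of the interaction
```

Because the clone forwards everything faithfully, the headset behaves exactly as expected, which is what makes this style of attack so hard to notice.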
Is the hair on the back of your neck standing yet? It should.
As MIT Technology Review put it, “It’s shocking to see how fragile and unsecure these VR systems are, especially considering that Meta’s Quest headset is the most popular such product on the market, used by millions of people.”
Even more disturbing, these attacks can apparently happen without the user noticing, and they can disorient the user’s sense of reality. Studies show that long-term AR and VR use has psychological effects. Users start treating things in AR or VR as real, stepping around virtual objects as if they were physically there, and because the human brain perceives and reacts to immersive digital experiences much as it does to real-world ones, prolonged use can alter a person’s perception of the physical world around them.
This means VR already has the ability to feed misinformation, deception, and other problematic content directly into people’s minds, deceiving them physiologically and subconsciously.
And because VR technology is still relatively new, users are not yet in the habit of watching for security flaws or traps while using it.
A computer science team at UCR’s Bourns College of Engineering, led by professors Jiasi Chen and Nael Abu-Ghazaleh, demonstrated that spyware can watch and record a user’s every motion and then use AI to translate those movements into words with 90% or better accuracy.
“Basically, we show that if you run multiple applications, and one of them is malicious, it can spy on the other applications,” Abu-Ghazaleh said. “It can spy on the environment around you, for example showing people are around you and how far they are. And it can also expose to the attacker your interactions with the headset.”
For example, a VR user who takes a break to check Facebook messages, air-typing a password on a virtual keyboard, could have that password captured by spyware. Spies could also interpret body movements to crash a virtual meeting in which confidential information is disclosed and discussed.
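The UCR work relies on machine learning over motion data; the toy sketch below is not the researchers’ code, and its virtual keyboard layout and tap coordinates are made up for illustration. It simply shows the underlying intuition: once another app can see where the controller was at each air tap, recovering the typed text can be as easy as a nearest-key lookup.

```python
# Toy illustration (not the UCR team's method) of why tracked hand
# coordinates leak keystrokes. The key positions and observed tap points
# below are hypothetical.

import math

# Assumed (x, y) positions of keys on a virtual keyboard, arbitrary units.
VIRTUAL_KEYS = {
    "p": (9.0, 2.0), "a": (0.5, 1.0), "s": (1.5, 1.0),
    "w": (1.0, 2.0), "o": (8.0, 2.0), "r": (3.0, 2.0), "d": (2.5, 1.0),
}

def nearest_key(tap, keys=VIRTUAL_KEYS):
    """Return the key whose position is closest to the observed tap point."""
    return min(keys, key=lambda k: math.dist(tap, keys[k]))

# Controller positions the spyware observed at each air tap (slightly noisy).
observed_taps = [(9.1, 2.1), (0.4, 0.9), (1.6, 1.1), (1.4, 1.0),
                 (1.1, 2.1), (8.1, 1.9), (3.1, 2.0), (2.4, 1.1)]

recovered = "".join(nearest_key(t) for t in observed_taps)
print(recovered)  # -> "password"
```

The real attack is more sophisticated, using AI models rather than a fixed lookup, but the principle is the same: motion data alone is enough to reconstruct what a user typed.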
According to Trend Micro, wearable technology is especially susceptible to social engineering attacks because it runs constantly and its users place a high degree of trust in it. Much of this hardware was never designed to carry security or authentication tools, so it becomes an open attack surface with easy access to sensitive user data and dashboards.
Military organizations around the world are beginning to use AR and VR to run deeply immersive training programs for their soldiers. In 2019, the Israel Defense Forces (IDF) trained for tunnel warfare against Hezbollah without ever setting foot underground: VR headsets transported soldiers into a simulated tunnel environment.
Imagine an enemy infiltrating such training and arriving on the battlefield prepared for every move the soldiers have rehearsed.
As VR and AR creep deeper into daily life, from gaming to military training, the risks extend far beyond stolen passwords. These attacks exploit not just data but perception itself, opening the door to mass manipulation. If cybersecurity doesn’t catch up fast, the virtual world could become our most dangerous battlefield.