As much as virtual reality technology might be a fun pastime and a trendy Christmas gift, there is "The Dark Side of VR," in which it "allows the most detailed and intimate digital surveillance yet," researchers warned The Intercept's Joshua Kopstein.
"As the tech industry continues to build VR's social future, the very systems that enable immersive experiences are already establishing new forms of shockingly intimate surveillance," Kopstein wrote.
"Once they are in place, researchers warn, the psychological aspects of digital embodiment — combined with the troves of data that consumer VR products can freely mine from our bodies, like head movements and facial expressions — will give corporations and governments unprecedented insight and power over our emotions and physical behavior."
While the internet has evolved into a rich data-mining enterprise for businesses, which track users' activity across the web to build targeted marketing plans specific to each user, VR has the potential to go even further by tracking emotions, expressions and potentially even thoughts, according to the report.
"The information that current marketers can use in order to generate targeted advertising is limited to the input devices that we use: keyboard, mouse, touch screen," Michael Madary, a researcher at Johannes Gutenberg University, said, per The Intercept. "VR analytics offers a way to capture much more information about the interests and habits of users, information that may reveal a great deal more about what is going on in minds."
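To make Madary's point concrete, the kind of "VR analytics" he describes could be as simple as logging where a headset is pointed, frame by frame, and totaling how long a user's gaze lingers on each object. The sketch below is purely illustrative — the class names, fields and units are invented, not taken from any real VR SDK:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FrameSample:
    """One frame of hypothetical VR telemetry (illustrative only)."""
    timestamp: float            # seconds since session start
    head_yaw: float             # degrees: where the user is looking
    head_pitch: float
    gazed_object: Optional[str]  # object under the user's gaze, if any

class InterestProfile:
    """Aggregates gaze dwell time per object across frames."""
    def __init__(self):
        self.dwell = {}   # object name -> accumulated seconds
        self._last = None

    def record(self, sample: FrameSample):
        # Credit the elapsed time to whatever the user was looking at
        # during the previous frame.
        if self._last is not None and self._last.gazed_object:
            dt = sample.timestamp - self._last.timestamp
            obj = self._last.gazed_object
            self.dwell[obj] = self.dwell.get(obj, 0.0) + dt
        self._last = sample

    def top_interest(self):
        """The object the user has gazed at longest, if any."""
        return max(self.dwell, key=self.dwell.get) if self.dwell else None
```

Feeding this a stream of head-pose samples yields a per-user ranking of what held their attention — exactly the kind of signal a keyboard or mouse never exposes.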
VR might even go so far as to alter behavior, the goal of many profitable businesses, according to the report.
"The goal of everything we do is to change people's actual behavior at scale," a Silicon Valley company told Harvard Business School professor Shoshana Zuboff. "... We can capture their behaviors, identify good and bad behaviors, and develop ways to reward the good and punish the bad. We can test how actionable our cues are for them and how profitable for us."
VR is still new, and its continued development is only scratching the surface — which means the potential dangers of intrusion, up to and including misuse or abuse, are still coming into view.
"The technology is changing very quickly," Madary added. "But I do not think that there is any technological barrier to the kinds of manipulation that raise concerns.
"Right now I am not aware of virtual environments that change based on the data collected about each particular user. But I don’t see any reason why such a personalized dynamic virtual environment could not be developed and sold (or given) to consumers."
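The "personalized dynamic virtual environment" Madary describes does not require any exotic technology. Given a per-user interest profile like the one VR analytics could collect, a scene could be re-dressed before each session. The sketch below is a hypothetical illustration — the threshold, categories and ad inventory are all invented:

```python
def personalize_environment(dwell_seconds, ad_inventory, threshold=1.0):
    """Choose which ads to surface in a user's next VR scene, based on
    gaze dwell time (in seconds) collected in earlier sessions.
    Purely illustrative: names and threshold are invented.
    """
    # Treat any object the user lingered on past the threshold as an interest.
    interests = {obj for obj, t in dwell_seconds.items() if t >= threshold}
    # Populate the scene only with ads matching those interests.
    return [ad for ad in ad_inventory if ad["category"] in interests]

inventory = [
    {"name": "car_promo",  "category": "cars"},
    {"name": "soda_promo", "category": "drinks"},
]
picked = personalize_environment({"cars": 3.2, "drinks": 0.4}, inventory)
# Only the car promo survives the filter for this user.
```

A few lines like these, wired to real telemetry, would produce exactly the environment-that-changes-per-user that Madary says faces no technological barrier.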