Privacy in the Metaverse May Not Be Possible: UCB Study

New work by University of California, Berkeley researchers suggests that privacy in the metaverse may be an impossibility. The report will come as a blow to metaverse maximalists who have long claimed the technology represents a bold new frontier for mankind.

In the study, researchers were able to show that a significant number of real metaverse users could be uniquely and reliably identified across multiple sessions, using only their head and hand motion relative to virtual objects. 

Motion Data Threatens Privacy in the Metaverse

While metaverse privacy concerns generally center on the data users disclose to the VR platform in question, UCB researchers raised another interesting point: that a person’s motion is as unique to them as a fingerprint. 

Indeed, after training a classification model on just five minutes of data per person, researchers could uniquely identify a user “amongst the entire pool of 50,000+ with 94.33% accuracy from 100 seconds of motion.” Even 10 seconds of motion was enough to positively identify users with a high degree of accuracy (73.20%). 

The paper purports to be the first work to “truly demonstrate the extent to which biomechanics may serve as a unique identifier in VR, on par with widely used biometrics such as facial or fingerprint recognition.”

Researchers gathered data from 55,541 real VR users in over 40 countries using 20 different types of VR device, conducting the study at Berkeley’s Center for Responsible Decentralized Intelligence (RDI). The position and orientation of players’ heads and hands were recorded every time a frame was rendered by the game studied, a VR rhythm game called Beat Saber.
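The core idea can be sketched in miniature: summarize a clip of per-frame telemetry into a handful of statistics, then match a fresh clip against enrolled users’ templates. Everything below is invented for illustration (the user names, the single head-height signal, the nearest-template matching); the actual study used full 3D head-and-hand pose and a trained classifier over 50,000+ users.

```python
import math
import random

random.seed(0)

def synth_frames(user_bias, n=500):
    """Simulate per-frame head-height telemetry for one user.
    Each (hypothetical) user has a characteristic height and sway amplitude."""
    height, sway = user_bias
    return [height + sway * math.sin(i / 7.0) + random.gauss(0, 0.01)
            for i in range(n)]

def features(frames):
    """Summarize a motion clip as (mean, standard deviation)."""
    mean = sum(frames) / len(frames)
    var = sum((f - mean) ** 2 for f in frames) / len(frames)
    return (mean, math.sqrt(var))

# Hypothetical users: (resting head height in metres, sway amplitude).
users = {"alice": (1.62, 0.03), "bob": (1.80, 0.05), "carol": (1.70, 0.08)}

# "Enrollment": one clip per user yields a feature template.
templates = {name: features(synth_frames(bias)) for name, bias in users.items()}

def identify(frames):
    """Match a fresh clip to the nearest enrolled template (Euclidean distance)."""
    f = features(frames)
    return min(templates, key=lambda name: math.dist(f, templates[name]))

# A new session from "bob" is matched back to him across sessions.
print(identify(synth_frames(users["bob"])))
```

Even this toy version shows why the threat is hard to mitigate: the identifying signal is the motion itself, which the platform must receive to render the scene at all.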

After discovering the extent to which telemetry data could breach users’ privacy, the authors claimed their “research constitutes a net benefit to society by highlighting the magnitude of the VR privacy threat and motivating future work on defensive countermeasures.”

The work may give users pause when they next reach for a VR headset and its pair of hand controllers: together, those three devices make them easy to identify, even in a sea of tens of thousands of users.

This raises the question: how will your biometric data be used and stored, and who will have access to it? If a platform is breached, could your identifying data be stolen and used to track your activities across different metaverses? The security and privacy implications are enormous.

Metaverse Tech Outpacing Privacy Standards

This year, Apple will launch its long-awaited VR headset. With CEO Tim Cook having described privacy as a “human right and civil liberty,” one wonders which privacy protections the device will feature, if any. Existing privacy legislation will surely have to be adapted to account for the possible infringements that could occur based on data gleaned from such equipment.

Telecommunications giant Ericsson recently acknowledged that the “sheer innovation rate of metaverse-related tech in recent years has outpaced the development of critical privacy standards in key tech areas such as AI, biometrics, and environmental sensing.”

With the metaverse becoming ever more popular – fresh research suggests Gen Z spends five times as much time in virtual worlds as they do on social networks – better privacy protections are urgently needed. These are likely to include legal protections, certification standards, and privacy-enhancing tech such as decentralized protocols.

A new book by privacy expert Elizabeth M. Renieris, the Founding Director of the Notre Dame-IBM Technology Ethics Lab, seeks to grapple with these questions. Entitled Beyond Data: Reclaiming Human Rights at the Dawn of the Metaverse, the work argues that laws centered on data protection, privacy, security, and ownership have paradoxically failed to uphold and protect core human values, including privacy.

Image credits: Shutterstock, CC images, Midjourney, Unsplash.