
The growth of extended reality (XR) technology means enhanced measures are needed to protect people’s privacy, a new study says.

The current lack of proper security measures could result in the misuse of data, leading to serious consequences such as stalking, harassment, or identity theft, an expert has warned.

Professor Ana Beduschi, from the University of Exeter Law School, has called for a new framework to safeguard data privacy combining law, policy and better practices in XR.

Extended reality encompasses a range of technologies such as augmented reality, virtual reality, and mixed reality. It is already used in a variety of sectors, including healthcare and medical training. These technologies pose significant data privacy risks because of the variety and volume of data they collect and process.

XR devices such as headsets are increasingly equipped with advanced sensors, cameras, and audio equipment that enable users to explore virtual worlds while the devices gather significant amounts of biometric information. This includes iris scans, voice samples, facial expressions, and head and hand position and movement.
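To illustrate the breadth of these signals, the sketch below shows what a single telemetry record from an XR headset might look like. The field names and types are illustrative assumptions, not any vendor’s actual API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of one telemetry frame an XR headset might capture.
# Field names and types are illustrative assumptions, not a vendor's API.
@dataclass
class XRTelemetryFrame:
    timestamp_ms: int                            # capture time
    head_pose: tuple[float, ...]                 # position (x, y, z) plus orientation
    left_hand_pose: tuple[float, ...]            # hand position and movement
    right_hand_pose: tuple[float, ...]
    gaze_direction: tuple[float, float, float]   # derived from iris/eye tracking
    pupil_diameter_mm: float                     # biometric signal
    facial_blendshapes: list[float] = field(default_factory=list)  # expression weights
    audio_chunk: bytes = b""                     # microphone samples (voice)

# At a typical capture rate of 60-90 frames per second, a one-hour
# session produces hundreds of thousands of such records per user.
```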

Practices that build users’ emotional profiles from data collected by XR devices, together with analytical inferences about their behaviour, also pose risks to data privacy. They could result in incorrect assumptions being made about individuals, potentially causing significant harm.

The study recommends that XR personal data processing should abide by the core data protection principles laid down in Article 5 of the GDPR. These include ensuring that data processing is lawful, fair, and transparent, limiting the purposes for which data is used, minimising the amount of data collected, maintaining data accuracy, limiting the storage period, and ensuring data integrity and confidentiality.
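As a rough illustration of how purpose limitation, data minimisation, and storage limitation might be made explicit in an XR application, here is a minimal sketch. The field names and retention periods are assumptions for illustration, not legal guidance.

```python
from datetime import timedelta

# Illustrative sketch only (not legal advice): expressing purpose limitation,
# data minimisation, and storage limitation as an explicit per-field policy.
# Field names and retention periods are assumptions.
RETENTION_POLICY = {
    "gaze_direction": {"purpose": "foveated rendering", "retain": timedelta(0)},  # in-memory only
    "head_pose":      {"purpose": "scene rendering", "retain": timedelta(0)},
    "usage_metrics":  {"purpose": "service improvement", "retain": timedelta(days=30)},
}

def may_store(field_name: str) -> bool:
    """Storage limitation: only fields with a declared purpose and a
    non-zero retention period may be persisted; everything else is
    processed transiently and discarded."""
    policy = RETENTION_POLICY.get(field_name)
    return policy is not None and policy["retain"] > timedelta(0)

assert may_store("usage_metrics") is True
assert may_store("gaze_direction") is False  # minimised: never written to disk
assert may_store("iris_scan") is False       # no declared purpose: not collected
```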

However, the GDPR does not cover all situations that could lead to privacy harms and discrimination, so additional measures are needed. Policymakers should adopt proactive policies, including incorporating privacy-enhancing technologies by default into the design of XR technologies.
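One example of such a privacy-enhancing default, sketched below under assumed parameters, is to coarsen and jitter location data on the device so that precise coordinates never leave it. The grid size and noise range are illustrative, not recommended values.

```python
import random

# Minimal sketch of a privacy-enhancing default: coarsen and jitter
# location on the device before any transmission. GRID_DEGREES is an
# illustrative assumption (~1 km at the equator), not a recommended value.
GRID_DEGREES = 0.01

def coarsen_location(lat: float, lon: float) -> tuple[float, float]:
    """Snap coordinates to a coarse grid, then add small random jitter,
    so the precise position is never available to the server."""
    def snap(value: float) -> float:
        return round(value / GRID_DEGREES) * GRID_DEGREES

    def jitter() -> float:
        return random.uniform(-GRID_DEGREES / 2, GRID_DEGREES / 2)

    return snap(lat) + jitter(), snap(lon) + jitter()

# Only the coarsened coordinates would ever be transmitted:
approx_lat, approx_lon = coarsen_location(50.7260, -3.5275)
```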

The study says standardisation practices can play an essential role in ensuring that technologies are developed in a fair and responsible manner. Transparency requirements should cover the types of data collected, the conditions of storage and processing, and any sharing of data with third parties. High-risk practices such as emotional profiling should be avoided, and users should be given the opportunity to opt out of privacy-intrusive features such as location tracking in XR applications.
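These disclosure and opt-out recommendations could, for instance, take the form of a machine-readable privacy manifest shipped with an XR application. The schema below is a hypothetical sketch, not an existing standard.

```python
# Hypothetical machine-readable privacy manifest for an XR application,
# covering the disclosures above: data types, storage and processing
# conditions, third-party sharing, and opt-outs. The schema is an
# illustrative assumption, not an existing standard.
PRIVACY_MANIFEST = {
    "data_collected": ["head_pose", "hand_pose", "usage_metrics"],
    "storage": {"location": "on_device", "max_retention_days": 30},
    "processing": {"on_device_only": True},
    "third_party_sharing": [],                    # empty list: nothing shared
    "features": {
        "location_tracking": {"default": "off", "opt_out": True},
        "emotion_profiling": {"enabled": False},  # high-risk practice avoided
    },
}
```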

Professor Beduschi said: “There needs to be collaborative efforts between policymakers, the technology sector, and other stakeholders such as standardisation bodies to ensure the development and deployment of XR technologies respect users’ and bystanders’ fundamental rights.

“XR providers should be transparent about the type of data collected, how it is processed, and if it is shared with third parties or reused for other purposes, such as product enhancement.

“They should only collect the data necessary for the development of their applications, and it should not be stored for longer than necessary. Pursuing every possible or technically feasible option to enhance user experience in XR environments should not come at the cost of user data privacy.”

The study says the legitimate interests of XR providers should not supersede the rights of individuals.

The EU’s Data Act, which was adopted in 2023 and will become applicable in 2025, could also become relevant to XR technologies. It applies to personal and non-personal data in the context of connected products (often referred to as the Internet of Things, or IoT) and related services.