Seven Metaverse Privacy Principles

Privacy Principles

  1. IDENTITY / privacy protects identity
  2. MARGINALIZED GROUPS / privacy centers on marginalized groups
  3. MENTAL PRIVACY / privacy protects mental experience
  4. CONTEXT / privacy settings should adapt to context
  5. SPACES / privacy settings should respect the norms of the space
  6. TRUST / privacy requires trust
  7. POWER DYNAMICS / privacy requires new models of consent

You can read the expanded version of each privacy principle below. Each principle is accompanied by quotes from consumers in our survey and from experts in academia and industry, along with an explanation of why the principle matters.

IDENTITY

  • HMDs will require certain forms of biometric data in order to function (eye tracking, hand tracking, pupil dilation, etc.). Biometric data can identify individuals by their anatomical signatures. The collection, use, and sharing of this information should be highly limited, and preferably it should never leave the user’s device (a minimal sketch of this on-device approach follows this list).
  • HMDs will also have the capacity to capture other data points, such as people’s behaviors, environments, and locations, that can be used to identify individuals. Devices should collect these data points only insofar as they are necessary for functionality, and the data should never be stored longer than necessary, shared, or sold.
  • These issues are not exclusive to data collected by HMDs. Passive digital footprint data from which identity can be inferred (browser version, cookie information, login time, browsing history, IP address, etc.) should, wherever possible, not be collected, shared, or stored.
  • Collection, use, storage, sharing or selling of identifiable data streams (location, behavior, biometrics, etc.) is a form of surveillance and should be avoided at all costs.
  • When personal and identifiable data must be collected and stored, it should be stored with the highest standards of security. In these cases, data protection should be a priority of teams planning to store such data from the outset and should never be an afterthought.
  • Data anonymization is not sufficient to protect users’ identities. Research has shown that it is easy to re-identify a person from an anonymized data set (Bushwick, 2019).
  • Different policy solutions are being explored. The German Data Ethics Commission has proposed that de-anonymization of data be made illegal and subject to heavy fines (Data Ethics Commission, 2019).
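
As an illustration of the “should not leave the user’s device” idea above, here is a minimal sketch (not drawn from the report; all names and structures are hypothetical) of reducing a raw eye-tracking stream to a single coarse value on-device, so that only the derived, non-identifying result is ever exposed and the raw biometric samples are discarded immediately.

```python
from dataclasses import dataclass


@dataclass
class GazeSample:
    """One raw eye-tracking sample (hypothetical structure)."""
    timestamp_ms: int
    pupil_diameter_mm: float
    x: float  # normalized gaze coordinates
    y: float


def focus_point(samples: list[GazeSample]) -> tuple[float, float]:
    """Reduce raw gaze samples to a single coarse value, entirely on-device.

    Only this derived, non-identifying result would ever leave the headset.
    """
    if not samples:
        raise ValueError("no gaze samples in this frame")
    x = sum(s.x for s in samples) / len(samples)
    y = sum(s.y for s in samples) / len(samples)
    return (x, y)


def handle_frame(samples: list[GazeSample]) -> tuple[float, float]:
    result = focus_point(samples)
    samples.clear()  # raw biometric data is not retained past this frame
    return result
```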

MARGINALIZED GROUPS

  • Misused or insecure data has the potential to exacerbate existing forms of discrimination and create real life consequences for users.
  • Different groups will have different privacy concerns. For instance, facial recognition technology has been shown to falsely identify minorities more often than white people, and smart home devices have become tools of domination and surveillance for women in domestic violence situations (Najibi, 2020; Bowles, 2018). US Army members’ use of fitness tracking apps also revealed the locations of overseas bases (Hern, 2018).
  • Metaverse developers should do research to understand the potential harms their technologies pose to various groups and should design for the most vulnerable groups to minimize potential harms.
  • Users’ personal data should never be used to harm or target them. It is a violation of their privacy and a breach of their trust.
  • In 2021, 18% (n=182) of people reported experiencing a privacy violation and 54% (n=545) were unsure if they had experienced a privacy violation (Outlaw, Carbonneau, et al. 2021).
  • Extending Fourth Amendment protections to online data would help protect users’ privacy and security. Currently, when data is stored on a company’s servers, it can be searched or seized by law enforcement more easily and no longer receives Fourth Amendment protection. How data is managed has consequences for the groups most vulnerable to police surveillance, searches, and seizures.

MENTAL PRIVACY

  • Technological devices and the companies that create them should respect the mental privacy of users.
  • Data should not be fed into machine learning algorithms for training or prediction without express permission. If you wish to use people’s data for machine learning or algorithmic training, explicitly invite people to opt in and make opt-out the default. Also make it possible for people to delete the data they provided if they change their mind about opting in (a minimal sketch of this pattern follows this list).
  • Data should never be used to profile people into categories based on demographics or behaviors. Within algorithmic models, these ‘tags’ could be turned against the individuals they describe.
  • Existing data about a user should never be used to manipulate their future decisions or behaviors. Choice architecture is powerful and should be deployed with extreme care.
  • Data should never be used to make predictions about an individual’s future. Not only could these inferences be algorithmically used to steer an individual in the direction of the prediction, but that data in the wrong hands could lead to negative consequences like higher health insurance premiums or employment rejection.
  • Any data inferred from biometric data is only ever an inference (such as assuming people’s interests based on pupil dilation). Biometric data has the potential to be used to make all kinds of assumptions about people, from their sexual orientation to their political leanings, as well as to predict their future behavior, but it is an invasion of people’s privacy to make assumptions about them based on involuntary responses.
  • Humans have a right to autonomy and growth. When technology reinforces (or pushes) existing habits, directs people through choice architecture, or predicts the future based on data from the past, it is actively working against the human capacity for transformation. Data should never have the power to erase the human element.
  • Chile recently became the first nation to protect neuro-rights. To learn more about neuro-rights and their proposed protections, read Towards a Governance Framework for Brain Data.
  • For more background on the application of brain technologies to decode and interpret mental states and processes, see Mecacci & Hasselager (2017).
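
As a minimal sketch of the opt-in pattern described above (not from the report; names and structures are hypothetical), the snippet below stores data for model training only when a user has expressly opted in, defaults everyone to opted-out, and deletes a user’s contributions if they later withdraw.

```python
from dataclasses import dataclass, field


@dataclass
class ConsentRecord:
    """Per-user consent state; the default reflects 'opted out unless expressly invited in'."""
    ml_training_opt_in: bool = False


@dataclass
class TrainingDataStore:
    """Hypothetical store of data contributed for model training."""
    consent: dict[str, ConsentRecord] = field(default_factory=dict)
    contributions: dict[str, list[dict]] = field(default_factory=dict)

    def add_sample(self, user_id: str, sample: dict) -> bool:
        record = self.consent.get(user_id, ConsentRecord())
        if not record.ml_training_opt_in:
            return False  # no express permission, so the data is never stored
        self.contributions.setdefault(user_id, []).append(sample)
        return True

    def withdraw(self, user_id: str) -> None:
        """The user changed their mind: revoke consent and delete what they contributed."""
        self.consent[user_id] = ConsentRecord(ml_training_opt_in=False)
        self.contributions.pop(user_id, None)
```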

CONTEXT

  • People’s privacy preferences tend to be contextual, differing depending on where they are, who they are with, and the type of relationship dynamics they have with those people and/or companies.
  • People may prioritize privacy differently, even in the same context, which raises questions about whose settings would take precedence in an interaction between two people wearing HMDs with divergent privacy settings (one person with high privacy, one with low). HMD settings may need to be as relational as their users (one possible resolution rule is sketched after this list).
  • HMDs will pose new privacy concerns in social settings. Recording will be much easier than with phones, and questions remain about how others will know when they are being recorded by friends, acquaintances, or strangers in the vicinity. Developers should acknowledge these concerns and provide mechanisms that let users agree on privacy settings between friends, within a group, at an event, or before any social interaction.
  • Our 2021 survey confirmed that people pay close attention to context when forming their privacy preferences. Only 31% (n=313) of people are uncomfortable with their data being used for product customization, while 62% (n=616) are uncomfortable with that same data being sold to advertisers, a gap best explained by how people contextually expect their data to be used (Outlaw, Carbonneau, et al., 2021).
  • Applin and Flick (2021) critique an approach of ‘design individualism,’ arguing instead that tech should create for “a collective that is enmeshed, intertwined and exists based on multiple, multiplex, social, technological, and socio technological relationships.”
  • For additional reading, Helen Nissenbaum has written extensively on Privacy as Contextual Integrity (2004).
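
One possible way to resolve the “whose settings take precedence” question raised above, offered purely as a hypothetical sketch (the ordering and names are assumptions, not a standard or the report’s recommendation), is to default a shared interaction to the most protective preference of anyone involved.

```python
from enum import IntEnum


class CaptureLevel(IntEnum):
    """Hypothetical per-user capture preference, ordered from most to least private."""
    NONE = 0        # no recording or sensing of me at all
    EPHEMERAL = 1   # live processing only, nothing stored
    LOCAL_ONLY = 2  # may be stored on the other person's device
    SHAREABLE = 3   # may be stored and shared


def effective_capture_level(*preferences: CaptureLevel) -> CaptureLevel:
    """Resolve divergent settings by defaulting to the most protective preference."""
    return min(preferences)


# A high-privacy user meeting a low-privacy user yields the high-privacy outcome.
assert effective_capture_level(CaptureLevel.NONE, CaptureLevel.SHAREABLE) is CaptureLevel.NONE
```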

SPACES

  • New technology should not turn public spaces in physical reality into virtual surveillance spaces. While HMDs give their wearers the ability to record events in the social spaces they occupy, they should never become a means of surveillance.
  • HMDs should also not surveil users’ private spaces. It would be an invasion of privacy to record what’s happening in people’s homes.
  • HMDs and other technological devices should not violate reasonable expectations of privacy or the norms of public and private spaces.
  • Current location tracking practices make the majority of users uncomfortable, which is one indicator of how environmental tracking would be perceived (Outlaw, Carbonneau, et al., 2021).
  • Sixty-two percent (n=626) of people reported they were uncomfortable being the bystander in a hypothetical video that was captured by a friend or neighbor and 66% (n=667) of people were uncomfortable being the bystander in a hypothetical video captured by a stranger (Outlaw, Carbonneau, et al., 2021).
  • AR glasses will likely give users the ability to customize their world view through color palettes, artist-designed templates, or even personal graffiti. However, these customizations should not become publicly visible or invade the privacy of other users or non-users in the physical world.
  • The metaverse should not invade people’s public or private spaces. Users should opt in to the maps they want to see and participate in.
  • Bystanders and non-users should be protected from potential capture by HMDs (one possible enforcement check is sketched below).
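
As a hypothetical enforcement check (not from the report; the policy model and names are assumptions), an HMD could refuse to record unless both the norms of the space and bystander consent allow it.

```python
from dataclasses import dataclass
from enum import Enum


class SpaceType(Enum):
    PRIVATE_HOME = "private_home"
    PUBLIC = "public"


@dataclass
class SpacePolicy:
    """Hypothetical recording policy attached to a physical or virtual space."""
    space_type: SpaceType
    recording_allowed: bool = False  # deny by default


def may_record(policy: SpacePolicy, bystanders_opted_in: bool) -> bool:
    """Recording requires both the space's norms and the consent of bystanders."""
    if policy.space_type is SpaceType.PRIVATE_HOME:
        return False  # never record inside private homes
    return policy.recording_allowed and bystanders_opted_in
```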

TRUST

  • Companies tend to frame their definition of privacy in ways that benefit themselves, rather than centering on people’s concerns about how their data is being used. For example, a company may highlight privacy features that control what other users can see, while not allowing a person to opt out of data aggregation by the company. The 2021 survey from The Extended Mind documented the following prevalence of consumer distrust:
    - 87% (n=879) of people use a privacy or security service (VPN, virus protection software, etc.)
    - 50% (n=505) pay for a privacy or security service (VPN, virus protection software, etc.)
    - 37% (n=374) of people have changed their privacy settings in the past 3 months
    - 28% (n=283) won’t use a product or service due to privacy concerns
    - 18% (n=182) of people have experienced a privacy violation
  • Transparency does not solve consumer privacy concerns. Companies attempt to offer data ‘transparency’ to reduce consumer concerns about their data collection practices, but transparency is not the same as creating defaults that actually center on people’s desired privacy preferences.
  • The dissonance between companies’ proclaimed priority of privacy and people’s experiences of those companies’ products and services leads to consumer distrust. The Extended Mind’s 2021 privacy survey respondents expressed this distrust of various technology companies in their own words:
    - “They are all about their own security but not the consumers.”
    - “I do not trust that company at all.”
    - “They are not consistent with their data privacy or posting policies.”
    - “I don’t trust their ability to keep data secure.”
  • For more detailed responses on consumer attitudes toward specific companies and products, you can download the codebook, an appendix to the full privacy report from The Extended Mind, at https://www.extendedmind.io/survey
  • Consumer education is not recommended as a strategy to build people’s trust in a company because it doesn’t fundamentally shift the power dynamics (see #7).

POWER DYNAMICS

  • Existing data collection practices leave people vulnerable to surveillance from companies, law enforcement, and other entities. This can lead to subsequent data misuses or data uses they did not knowingly consent to.
  • Users should be able to access products or services without being forced to opt in to a company’s data collection practices. Users need easy ways to set boundaries around their privacy, such as via third-party intermediaries (King, 2022).
  • ‘Notice and consent’ models can disempower users by forcing them to agree to terms and conditions without any alternative. This often means users agree to what they consider privacy violations in exchange for a service.
  • Companies often employ ‘dark patterns’ that push users into certain choices, such as default settings that favor the company’s data collection goals or updates that change users’ settings. These patterns reflect the power dynamics between users and companies: consumers evaluate the benefit of a product or service at a tremendous knowledge deficit compared to the people who designed the technology or wrote the ToS.
  • Social psychology has shown that the human default is to be agreeable. What this means, according to Vanessa Bohns, is that after agreeing to terms of service in order to use a product or service, “they may feel violated because they agreed to something they didn’t really want to do” (Carbonneau, 2021).
  • The burden is often placed on consumers to opt out of data collection practices they don’t like (when the option is even available). However, data collection should not be the default. Consumers should be automatically opted out, with the option to opt in.
  • Users, not companies, should hold control over their sensitive data. At minimum, the power dynamics between consumers and companies need to be reversed in favor of the consumer. Given the knowledge asymmetry between the two groups, companies have a responsibility to create accessible new models for privacy.
  • Defaults and updates should maintain or increase people’s privacy (and trust), never weaken it (a minimal sketch of this rule follows this list).
  • To read a recent classification of data misuses, which show how personal data can be weaponized against people, check out Kröger, Miceli and Müller (2021).
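
A minimal sketch of the “defaults and updates should never weaken privacy” rule (hypothetical names; one of many possible implementations): every setting defaults to the most private choice, and a product update can only make a setting more private automatically, never less.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class PrivacySettings:
    """Hypothetical account settings; every field defaults to the most private choice."""
    share_usage_data: bool = False
    personalized_ads: bool = False
    sell_data_to_partners: bool = False


def apply_update(current: PrivacySettings, proposed: PrivacySettings) -> PrivacySettings:
    """Apply a product update without silently weakening anyone's privacy.

    A setting may only move toward more privacy automatically; anything less
    private than the user's current choice is ignored until they explicitly opt in.
    """
    return PrivacySettings(
        share_usage_data=current.share_usage_data and proposed.share_usage_data,
        personalized_ads=current.personalized_ads and proposed.personalized_ads,
        sell_data_to_partners=current.sell_data_to_partners and proposed.sell_data_to_partners,
    )
```

The boolean AND encodes the design choice that any change toward less privacy must come from an explicit user action, never from an update.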

