How contextual are people’s preferences for data usage and sharing?
The Extended Mind conducted a 1,010-person survey of U.S. participants, weighted to the 2019 American Community Survey by age, gender, race, household income, and census region. The goal was to assess consumer knowledge of current data practices and protections, and to gather consumer preferences around data management for VR and AR (XR) data. Very little research has been done on consumer XR preferences, but because widespread adoption of XR is imminent, it is important to understand user preferences regarding data collection and privacy. This research was funded by Meta Reality Labs.
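The survey report doesn't spell out the weighting procedure beyond the target variables, but a common way to weight a sample to Census benchmarks like these is iterative proportional fitting, or "raking." Here is a minimal sketch in Python; the `rake` function, the toy sample, and the target shares are all illustrative assumptions, not the survey's actual method or the real ACS figures:

```python
import numpy as np
import pandas as pd

def rake(df, targets, max_iter=50, tol=1e-6):
    """Iterative proportional fitting: adjust weights so the weighted
    marginal share of each category matches its target share.

    targets: {column_name: {category: target_share}}
    Returns a Series of weights normalized to sum to len(df).
    """
    weights = pd.Series(1.0, index=df.index)
    for _ in range(max_iter):
        max_change = 0.0
        for col, shares in targets.items():
            # Current weighted share of each category of this variable
            current = weights.groupby(df[col]).sum() / weights.sum()
            # Scale each respondent's weight by target/current for their category
            factors = df[col].map({c: shares[c] / current[c] for c in shares})
            weights *= factors
            max_change = max(max_change, (factors - 1).abs().max())
        if max_change < tol:  # all margins match their targets
            break
    return weights * len(df) / weights.sum()

# Hypothetical example with made-up sample skews and made-up target margins
df = pd.DataFrame({
    "gender": np.random.choice(["female", "male"], size=1010, p=[0.6, 0.4]),
    "region": np.random.choice(["northeast", "midwest", "south", "west"],
                               size=1010, p=[0.25, 0.25, 0.25, 0.25]),
})
targets = {
    "gender": {"female": 0.51, "male": 0.49},
    "region": {"northeast": 0.17, "midwest": 0.21, "south": 0.38, "west": 0.24},
}
df["weight"] = rake(df, targets)
```

Raking only matches the marginal distributions of each variable, not their joint distribution, which is why it is a standard choice when full cross-tabulated population targets aren't available.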
XR devices raise privacy concerns because their functionality relies on a wide variety of data streams, including personally identifiable information. Given this, we wanted to investigate how people feel about that data being collected and applied in different settings. In this blog post, we cover how respondents felt about their information being used across various contexts.
We were curious how people's comfort levels would change when their personal data was used for law enforcement, product improvement, and advertising. This line of questioning was inspired by Helen Nissenbaum's theory of contextual integrity, which predicts that people's preferences vary based on factors such as context, appropriateness, means of distribution, and the data itself (Nissenbaum, 2004).
Our findings indicate that when it comes to personally identifiable information, privacy is contextual for about two-thirds of people. Almost a third of our survey respondents were uncomfortable with every potential use of their data that we asked about.
Selling data is the most uncomfortable use case
People are most uncomfortable with their data being sold. Almost two-thirds of respondents were consistent in their discomfort with this use case, regardless of whether the data was being sold to a health insurance company, law enforcement, or advertisers.
When it came to personal data being used to make predictions about the user, responses varied by use case. The 56% discomfort level (n=566) for predicting who respondents are attracted to may reflect the sensitive nature of that inference.
The applications of data that respondents were most comfortable with were product improvement (32% uncomfortable, n=323) and customization, which may imply that people are willing to trade some privacy for convenience or a better product experience. However, more research is needed to understand the motivations behind these responses.
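As a rough sense of the precision behind these point estimates: with n = 1,010, the simple-random-sample margin of error at 95% confidence is around ±3 percentage points. A quick back-of-envelope check, assuming the standard normal-approximation formula (the 0.66 input is our reading of "almost two-thirds," and weighting would inflate these figures somewhat via the design effect):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% normal-approximation margin of error for a proportion.
    Ignores the design effect from weighting, so it's a lower bound."""
    return z * math.sqrt(p * (1 - p) / n)

for label, p in [("selling data (~2/3 uncomfortable)", 0.66),
                 ("attraction predictions", 0.56),
                 ("product improvement", 0.32)]:
    print(f"{label}: {p:.0%} \u00b1 {margin_of_error(p, 1010):.1%}")
```

At this sample size, the differences between the use cases discussed above are well outside the margin of error.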
People report current uses of data as privacy violations
In addition to asking about hypothetical scenarios, we asked respondents whether their data had ever been used in a way that felt like a privacy violation. Eighteen percent (n=182) of respondents said yes. Here are some examples of what they considered to be privacy violations:
“The state police accessed my home video system thru an app on someone’s phone.”
“An app or service selling my information after opting out of said app or service.”
“Making purchasing suggestions based on other things previously purchased.”
Many of the hypothetical scenarios we created for the comfort questions reappeared in these open-ended responses. The fact that some respondents had experienced those scenarios, and named them as privacy violations, may explain the high discomfort levels reported above.
Takeaways
People are uncomfortable with how their data is being used. Written responses made clear that some users consider existing uses of their data to be privacy violations. And our questions about hypothetical use cases showed that even in the best case, almost a third of people are uncomfortable with their data being used for the stated purpose.
With widespread AR/VR adoption close on the horizon, we need to ask ourselves how we can best protect users' privacy, and what the best means is of protecting increasingly sensitive data streams before their collection becomes normalized.
To see the full survey results, visit: www.extendedmind.io/survey
Resources
Nissenbaum, H. (2004). Privacy as contextual integrity. Washington Law Review. https://core.ac.uk/download/pdf/267979739.pdf