User Experience of Privacy: some challenges with the UX of privacy features

Jessica Outlaw
Mar 9, 2022

In 2021, The Extended Mind researched people’s online privacy preferences and how they managed their own privacy. Our survey of 1,010 U.S. participants made it clear that many people view privacy as something desirable:

“I want my information as private as possible”
“Because privacy is important”

There are many aspects of online privacy that make it a difficult subject to engage with. People are often unaware of which types of data, or how much, are being collected, and they may have no idea where it’s stored, who it’s shared with, or how to protect it from being stolen.

However, some features that are meant to help people manage their privacy can end up further complicating things. In this blog post, we explore some of the challenges with existing privacy UX that product teams can address to help people engage more conscientiously with their privacy.

People distrust defaults and updates

We asked the 37% (n=374) of our survey respondents who changed their privacy settings in the past three months why they had done so; a number of them named insufficient default settings as their motivation.

“Because I felt unsafe with the previous settings.”

“I felt like all my data was unsafe under the settings they were under.”

“Did not like the new privacy terms”

“[I] want companies to have less access to my data, don’t want data about me/my habits sold, don’t want anyone to have access to my online profiles without my express permission.”

This indicates that current default settings don’t meet these people’s standards for privacy protection. Compounding the issue, other respondents noted that their customized settings were sometimes changed by updates. Or, as one person stated emphatically:

“SOMETIMES MY SETTINGS GET SWITCH FROM TIME TO TIME AND NOT KNOWING — SO I HAVE TO ADJUST MY PRIVACY SETTINGS.”

“New updates contain new ways to sneak into my personal space”

When devices or products update settings in a way that surprises people, they may feel as if their privacy preferences and choices aren’t being respected. This may erode consumer trust in a product over the long term.

Other researchers have found that “on average, it would take five times as long to opt out as it did to opt in for data collection” (Ng, 2019). The end result is that the process of protecting one’s online privacy can feel like more work than accepting the default settings offered.

Notify and consent models can feel like a privacy violation

Research has found that only 9% of adults always read privacy policies before agreeing to the terms and conditions, while 36% never read them (Pew Research Center, 2019). Based on these numbers, it’s likely that most people don’t know what they’re agreeing to when they consent to privacy policies.

In our survey, we found that some people felt violated by how companies used their data. But in some cases, the examples they cited were likely uses of their data they had agreed to through the terms and conditions.

“When I’ve visited websites casually looking at products and then I received ads and/or emails regarding what I was looking at.”

“Showing or sending emails or ads on information or products not authorized or shared with [companies].”

The people who reported these instances as privacy violations likely did not read or understand the privacy policies of the websites they visited. What type of privacy features might better serve people by communicating terms of service and the consequences of engaging with the product or service?

Takeaways

When a product or service doesn’t properly inform and assist the people using it, a dissonance can arise between users’ expectations and the consequences of interacting with it. This dissonance may be heightened when sensitive topics are involved.

In our research, 28% (n=283) of people reported that there are products or services they refuse to use due to privacy concerns. Is the solution to build trust with users by offering more private default settings? Making it easier to access and change privacy settings? Something else? What we have tried to capture in the quotes and statistics of this blog post is the concern people have around technology today.

Here are some considerations for product teams who are looking at the user experience of privacy features:

  1. Explore models of gaining consent that build trust with users.
  2. Conduct research to assess target customers’ priorities around privacy and design defaults that support them.
  3. When deploying updates, check that settings important to people’s privacy are not negatively impacted.
  4. Consider what types of options people are given in the settings. Is it possible for them to opt out of data collection and still use the product? Why or why not?

For a December 2021 example of a company that automatically opted people into sharing their browsing history, device location, and phone numbers called, see: Verizon might be collecting your browsing history and here’s how to stop it: Verizon says it’s to better “understand your interests”

Resources

Ng, A. (2019, December 21). Default settings for privacy — we need to talk. CNET. https://www.cnet.com/news/default-settings-for-privacy-we-need-to-talk/

Pew Research Center. (2019, November 15). 4. Americans’ attitudes and experiences with privacy policies and laws. Pew Research Center: Internet, Science & Tech. https://www.pewresearch.org/internet/2019/11/15/americans-attitudes-and-experiences-with-privacy-policies-and-laws/


Jessica Outlaw

Culture, Behavior, and Virtual Reality @theextendedmind