Privacy Harms: A Taxonomy to Understand Privacy Violations
Written by Sara Lucille Carbonneau
Content Warning: This article discusses the potential harms of data privacy violations, including physical harms that could come to users such as assault or murder.
Over the past year, the Extended Mind has become increasingly interested in the potential harms that could result from privacy violations involving XR data collection and sharing. This interest arises from the biometric data that XR devices collect on users, which expands the scope of data gathered about individuals and creates inherent privacy risks: combining sensitive personal identification data with existing data streams makes anonymization far more difficult.
In a study performed at Stanford University, researchers found that machine learning algorithms could identify individuals out of a pool of 511 people with 95% accuracy using a 20-second sample of 360-video viewing data collected in VR.¹
Twenty seconds is a very small sample for such high accuracy. As VR and AR grow more prevalent, few people will be able to avoid having 20 seconds of XR data collected about them. To better understand the types of harms that result from privacy violations, we turned to Privacy Harms, a recent legal research paper by Danielle Keats Citron and Daniel J. Solove that creates a taxonomy of privacy violation harms. While these harms are not specific to XR, they outline what is at stake when it comes to data, privacy, and safety.
In this post, we will walk through each of the harms outlined in the paper by these legal scholars. We have organized these harms into two sections: recognized harms and disputed harms. Recognized harms have been acknowledged by courts in multiple cases, thus creating precedent for treating such violations as harms under the law. Disputed harms, on the other hand, have either not been recognized or have been inconsistently recognized by the courts, and are thus areas of vulnerability for user privacy.
We at the Extended Mind are not attorneys and are not trained in all of the nuances of these legal cases. However, we believe that technologists would benefit from having access to this taxonomy of privacy harms.
Recognized Harms
These are harms that have been consistently recognized under the law and supported by the court system.
Physical Harms
“The improper sharing of personal data can create unique opportunities for physical violence.”² The internet has made it easier to access people’s personal data, and criminals can leverage these tools to find and harm their targets.
E.g. stalking, breaking and entering, assault, rape, murder
Economic Harms
“Privacy violations can result in financial losses that the law has long understood as cognizable harm… Many cases involving economic harm are data breach cases… Plaintiffs have difficulty proving a causal link between particular data breaches and identity theft.”²
E.g. identity theft, fraudulent charges, economic loss
Reputational Harms
“Reputational harms impair a person’s ability to maintain ‘personal esteem in the eyes of others’ and can taint a person’s image. They can result in lost business, employment, or social rejection.”²
E.g. libel and slander, defamation
Emotional Distress Harms
“One of the most common types of harm caused by privacy violations is emotional distress. Emotional distress encompasses a wide range of emotions, including annoyance, frustration, anger, and various degrees of anxiety.”²
E.g. mental pain and distress, feelings of violation, mortification, fear, humiliation, and embarrassment
Relationship Harms
“Privacy violations can harm personal and professional relationships as well as relationships with organizations. People modulate personal relationships by maintaining boundaries around their information or by withholding information from some people and not others.”²
E.g. loss of confidentiality, job loss, ostracization from one’s community
Chilling Effect Harms
“Privacy violations can produce harm by inhibiting people from engaging in certain civil liberties such as free speech, political participation, religious activity, free association, freedom of belief, and freedom to explore ideas. Such harm is often called a ‘chilling effect.’…Chilling effects have an impact on individual speakers and society at large as they reduce the range of viewpoints expressed and the nature of expression that is shared.”²
E.g. communication monitoring; sharing or selling privately shared information
Discrimination Harms
“Privacy violations can cause discrimination harms, which involve entrenching inequality and disadvantaging women and people from marginalized communities… The civil rights legal tradition has the capacity and vocabulary to address discrimination harm… But these laws still have not been applied sufficiently to privacy violations.”²
E.g. disproportionate surveillance; targeted harassment, threats, or doxing; discrimination in one’s ability to work, attend school, use the telephone, secure housing, or vote on equal terms
Disputed Harms
These are harms that have either not been recognized or have been inconsistently recognized by the law.
Thwarted Expectation Harms
“A common type of privacy violation involves thwarting people’s privacy expectations by breaking promises made about the collection, use, and disclosure of personal data. Courts are generally dismissive of thwarted expectations as a cognizable harm unless it is accompanied by other harms, such as reputational, economic, or emotional harm.”²
E.g. improper sharing or selling of data that leads to individual harm (usually economic, reputational, or emotional)
E.g. inability to know how long data is stored, inability to know how data is used
Data Quality Harms
“Many privacy laws require that organizations adhere to the principle of ‘data quality’ — keeping data accurate, complete, and up-to-date. Courts are inconsistent in whether inaccuracies in data constitute a cognizable harm.”² Often, other forms of resultant harm (economic, discriminatory, etc.) must be identified for data quality harms to be acknowledged.
E.g. improper identification; improper reporting of credit, marital status, profession, education, etc.; potential discrimination or targeting
Informed Choice Harms
“Courts are inconsistent about recognizing harm for failing to give individuals information to assist them in making informed choices about their personal data or exercise of privacy rights.”²
E.g. failing to inform individuals of their rights, failing to provide job applicants with the background check data that cost them a job opportunity
Data Security Harms
“Courts are inconsistent in finding harm for failing to follow security safeguards that have not yet resulted in a data breach.”² While some courts have found that failing to follow security safeguards constitutes a breach of confidentiality, others have held that sufficient harm cannot be proved without evidence of consequent harms, such as identity theft.
E.g. printing more than the last five digits of a credit card number on receipts (contrary to FCRA mandates), failure to enforce single login service promises
Disturbance Harms
“Disturbance harms involve unwanted communications that disturb tranquility, interrupt activities, sap time, and otherwise serve as a nuisance.”² Courts have issued disparate rulings on whether unsolicited telephone calls and text messages constitute harm.
E.g. unsolicited telephone calls and text messages
Autonomy Harms
“Autonomy harms involve the restriction, coercion, or manipulation of people’s choices. People are either directly denied free will to decide or are tricked into thinking that they are freely making choices when they are not. In the consumer privacy context, the most prevalent form of autonomy harm is ‘manipulation’… Manipulation has not been the subject of many privacy cases.”²
E.g. deploying personal data on a massive scale to influence voting, manipulation of consumer decision-making
Harms that have a direct impact on individuals, such as physical, economic, or emotional harms, have been recognized by the judicial system as punishable by law. When privacy violations harm individuals in these ways, the courts require those responsible for the violations to make amends for the injury.
However, the courts have failed to regulate the data management practices that often lead to privacy violations. When companies fail to delete data according to their policies or fail to follow security safeguards, the courts do not recognize these acts as harms in and of themselves, because they merely create the potential for harm without yet tangibly causing it. Recognizing such acts would force the courts to grapple with the future of data rather than its present.
Furthermore, the inherent harms of these data management practices, such as the loss of control over one’s data or the failure to offer people the ability to make informed choices about their data, have also not been recognized by courts. This means that courts have not prioritized user control, transparency, or choice when it comes to people’s own data.
Ultimately, the courts have not taken a preventive approach to privacy violation harms. Their current approach rests on remediating harms after they happen rather than preventing them in the first place. But as more data is collected on people over time, including highly personal biometric data, data management and privacy regulation need to become a priority in order to ensure the safety and wellbeing of users.
About the author: Sara Lucille Carbonneau is a researcher on The Extended Mind team with a passion for ethics and privacy in technological innovation.