Biometric Psychography and Its Implications for Personal Autonomy

Jessica Outlaw
Oct 29, 2021


Written by Sara Lucille Carbonneau and originally published at www.extendedmind.io.

As the widespread adoption of AR and VR grows more imminent, so does the potential for data collection at massive scale. XR devices will take personal data collection to a new level by combining existing data streams on people’s preferences, likes, and interests with continuous biometric data.

While the personally identifiable nature of biometric data and the risks it entails are often discussed, the way it lends itself to creating even deeper and more intimate user profiles is less explored. But biometric data may in fact be the window into users’ most private thoughts and feelings.

In her award-winning paper released in Jan 2021, Watching Androids Dream of Electric Sheep: Immersive Technology, Biometric Psychography, and the Law, Brittan Heller introduced the concept of biometric psychography. Psychographics is the study of people’s attitudes, interests, values, and personality traits. The concept of biometric psychography, then, captures the level of intimate knowledge that companies will be able to collect on individuals using a combination of anatomical and personality-driven data.

“This term encompasses behavioral and anatomical information used to identify or measure a person’s reaction to stimuli over time, which provides insight into a person’s physical, mental, and emotional state, as well as their interests.”

- Brittan Heller, Technology and Human Rights Fellow at Carr Center for Human Rights Policy

XR headsets will track not only what people pay attention to, but also for how long, with what intensity, and with what emotional response. This could be achieved through a combination of tracking pupil dilation and facial muscles, and in some cases galvanic skin response, electroencephalography (EEG), electromyography (EMG), and electrocardiography (ECG).
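
To make concrete how broad such a combined stream could be, here is a minimal sketch in Python of the kind of time-stamped record a headset runtime might assemble. Every name here is hypothetical and invented for illustration; it does not correspond to any real headset SDK or vendor API.

```python
# Purely illustrative sketch of a combined XR sensor record.
# All field names are hypothetical; no real headset SDK is referenced.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ReactionSample:
    timestamp_ms: int                            # when the sample was captured
    gaze_target: str                             # the stimulus the eyes were fixed on
    fixation_ms: int                             # how long attention was held there
    pupil_diameter_mm: float                     # pupil dilation from the eye tracker
    facial_action_units: dict = field(default_factory=dict)  # facial-muscle activations
    skin_conductance_us: Optional[float] = None  # galvanic skin response, if available
    heart_rate_bpm: Optional[float] = None       # ECG-derived heart rate, if available
    eeg_band_power: Optional[dict] = None        # EEG frequency-band power, if available
```

The point of the sketch is not the exact fields but the fusion: each record ties an involuntary physiological reading to a specific stimulus at a specific moment.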

When people see something they like, their pupils dilate. Heller equates this to an “involuntary like button,” which will offer companies a simple and easy way to measure what’s pleasurable to each individual. When combined with measurements such as galvanic skin response and electrocardiogram readings, companies can track excitement and arousal. The final layer of facial tracking can suggest the emotions behind those signals, such as joy, confusion, or anger.
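
Continuing the illustrative sketch above, a deliberately naive scoring function shows how such signals could be reduced to per-stimulus data points. The formulas and baselines are invented for illustration and are not a validated affect model; real emotion inference is far harder, as the footnote below notes.

```python
# Toy illustration, not a validated affect model: invented formulas that
# reduce the ReactionSample above to "interest" and "arousal" scores.
def score_reaction(sample: ReactionSample,
                   baseline_pupil_mm: float,
                   baseline_scl_us: float,
                   resting_hr_bpm: float = 70.0) -> dict:
    # Dilation above the user's own baseline is read as interest:
    # the "involuntary like button".
    interest = max(0.0, (sample.pupil_diameter_mm - baseline_pupil_mm)
                   / baseline_pupil_mm)

    # Skin conductance and heart rate above baseline suggest arousal.
    arousal = 0.0
    if sample.skin_conductance_us is not None:
        arousal += max(0.0, (sample.skin_conductance_us - baseline_scl_us)
                       / baseline_scl_us)
    if sample.heart_rate_bpm is not None:
        arousal += max(0.0, (sample.heart_rate_bpm - resting_hr_bpm)
                       / resting_hr_bpm)

    # The user never clicks anything; the reaction itself is the data point.
    return {"target": sample.gaze_target, "interest": interest, "arousal": arousal}
```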

In these scenarios, the data collected on users will be involuntary. Unconscious and uncontrollable biological responses will be transformed into data points, and users will no longer actively participate in the data collection process through clicking on links; their very reactions to stimuli will be the data.

This will make it exceptionally difficult for users to maintain a sense of mental privacy and personal autonomy. Companies will build computed profiles that can identify everything from a user’s sexuality to their political interests without the user ever volunteering that information. Even if users wanted to self-censor, they would ultimately be unable to do so. The issue then becomes not only one of consent, but of a user’s fundamental right to privacy.

Users have a right to maintain privacy over their internal thoughts and feelings and to guard interests they don’t wish to share with others. In particular, such intimate data should not be used or sold in a way that pits the user’s data against their own interests.

Immersive technology is unique in that it requires a wide range of sensors to function. The data these sensors collect, from the stimuli users interact with to their reactions to those stimuli, is highly valuable to advertisers and other third parties. This will be a tempting route to monetization for an XR industry that lacks other clear income streams.

While XR devices have the opportunity to become the new norm of mobile technology and serve users in novel ways, we need to ensure the data they collect serves people. Heller calls on lawmakers to preemptively enact laws that make XR safe for users before the technology becomes the new norm, rather than waiting until users have become victims of privacy violations on a massive scale.

— — — — — — — —

Although research published in 2019 suggests that it is actually quite difficult to infer emotions from facial movements, the myth of the face as a window to emotion persists.
Barrett, L. F., Adolphs, R., Marsella, S., Martinez, A. M., & Pollak, S. D. (2019). Emotional expressions reconsidered: Challenges to inferring emotion from human facial movements. Psychological Science in the Public Interest, 20(1), 1–68. https://journals.sagepub.com/stoken/default+domain/10.1177%2F1529100619832930-FREE/full

