The Neuroscience of Gestures

Jessica Outlaw
3 min read · Nov 15, 2016


“How can you tell what these people are talking about?”

I’d like to persuade you that gestures are a fundamental building block of human language and thought. This begins a series of blog posts on gestures and how physical movement in VR & AR affects cognition.

Part one of this series will deal with why gestures provide a shortcut to human thought.

But first, on the tech front:
Devices that capture small hand gestures are already available (like Microsoft HoloLens), and more are on the way. Project Soli at Google uses radar to track micro-motions and twitches: the device's radar senses how the user moves their hands and interprets the intent behind those motions. A link to the full Project Soli video is here.
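
To make "interpreting intent" from micro-motions a bit more concrete, here is a minimal, hypothetical sketch of gesture classification. It is not the Project Soli or HoloLens API; it assumes you already have short windows of radar-derived features (for example, flattened range-Doppler frames) labeled with a few example micro-gestures, and it uses a simple nearest-centroid classifier purely for illustration.

```python
# Hypothetical sketch: classifying hand micro-gestures from radar-like features.
# Assumptions (not from any real device SDK): each gesture window arrives as a
# fixed-size feature vector, e.g. a flattened range-Doppler frame.

import numpy as np

GESTURES = ["tap", "swipe", "pinch"]  # example micro-gestures, purely illustrative


def fit_centroids(windows: np.ndarray, labels: list[str]) -> dict[str, np.ndarray]:
    """Average the feature vectors for each gesture label into one centroid."""
    return {
        g: windows[[i for i, lab in enumerate(labels) if lab == g]].mean(axis=0)
        for g in set(labels)
    }


def classify(window: np.ndarray, centroids: dict[str, np.ndarray]) -> str:
    """Return the gesture whose centroid is closest to this feature window."""
    return min(centroids, key=lambda g: np.linalg.norm(window - centroids[g]))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake training data standing in for radar-derived feature windows.
    n_per_gesture, dim = 20, 64
    labels = [g for g in GESTURES for _ in range(n_per_gesture)]
    offsets = {g: rng.normal(size=dim) for g in GESTURES}
    windows = np.stack([offsets[lab] + 0.1 * rng.normal(size=dim) for lab in labels])

    centroids = fit_centroids(windows, labels)
    new_window = offsets["swipe"] + 0.1 * rng.normal(size=dim)
    print(classify(new_window, centroids))  # expected: "swipe"
```

Real pipelines would use richer features and learned models, but the shape of the problem is the same: a stream of motion data in, a discrete gesture (and the intent behind it) out.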

Why are gestures powerful shortcuts to cognition?

I’m reposting an article from Scientific American that answers the question, “Why is talking with gestures so much easier than trying to talk without gesturing?” Psychology professor Michael P. Kaschak responds:

“A person in a fit of rage may have trouble verbalizing thoughts and feelings, but his or her tightly clenched fists will get the message across just fine.

Gesturing is a ubiquitous accompaniment to speech. It conveys information that may be difficult to articulate otherwise. Speaking without gesturing is less intuitive and requires more thought. Without the ability to gesture, information that a simple movement could have easily conveyed needs to be translated into a more complex string of words. For instance, pointing to keys on the table and saying, ‘The keys are there,’ is much faster and simpler than uttering, ‘Your keys are right behind you on the countertop, next to the book.’

The link between speech and gesture appears to have a neurological basis. In 2007 Jeremy Skipper, a developmental psychobiologist at Cornell University, used fMRI to show that when comprehending speech, Broca’s area (the part of the cortex associated with speech production as well as with comprehension of language and gesture) appears to ‘talk’ to other brain regions less when the speech is accompanied by gesture. When gesture is present, Broca’s area has an easier time processing the content of speech and therefore may not need to draw on other brain regions to understand what is being expressed.”
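
The keys example in the quote above suggests one concrete pattern for VR/AR interfaces: let a pointing gesture resolve a vague spoken word like “there,” instead of requiring a full verbal description of the location. The sketch below is a hypothetical illustration of that idea; the scene objects, coordinates, and function names are all invented for this example, and the “resolution” is just a nearest-object-to-pointing-ray check.

```python
# Hypothetical sketch: using a pointing gesture to resolve a vague spoken
# reference ("the keys are there") instead of a verbal location description.
# All names here are invented for illustration; this is not a real VR SDK.

import numpy as np

# Example scene: object name -> 3D position (meters, app coordinates).
SCENE = {
    "keys": np.array([1.0, 0.9, -0.5]),
    "book": np.array([1.2, 0.9, -0.4]),
    "mug": np.array([-2.0, 1.0, 0.3]),
}


def resolve_pointing(origin: np.ndarray, direction: np.ndarray, scene: dict) -> str:
    """Return the scene object closest to the ray defined by the pointing hand."""
    d = direction / np.linalg.norm(direction)

    def distance_to_ray(p: np.ndarray) -> float:
        v = p - origin
        t = max(v @ d, 0.0)          # project onto the ray, clamp behind the hand
        return float(np.linalg.norm(v - t * d))

    return min(scene, key=lambda name: distance_to_ray(scene[name]))


if __name__ == "__main__":
    hand_origin = np.array([0.0, 1.0, 0.0])
    pointing_dir = SCENE["keys"] - hand_origin   # user points roughly at the keys
    target = resolve_pointing(hand_origin, pointing_dir, SCENE)
    print(f"'There' resolves to: {target}")      # expected: keys
```

The design point is the same one Kaschak makes about speech: the gesture carries spatial information that would otherwise have to be spelled out in a longer string of words.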

Takeaways for VR/AR Designers:

  • People process information more deeply when they are gesturing
  • Broca’s area processes speech more easily, drawing less on other brain regions, when gestures accompany speech
  • The tech exists for picking up human micro-gestures

Further reading:

https://www.scientificamerican.com/article/why-is-talking-with-gestures-easier/

Written by Jessica Outlaw

Culture, Behavior, and Virtual Reality @theextendedmind
