The semantic specificity hypothesis: when gestures do not depend upon the presence of a listener

K. Pine, Daniel Gurney, Ben Fletcher

Research output: Contribution to journal › Article › peer-review


Abstract

Humans gesture even when their gestures can serve no communicative function (e.g., when the listener cannot see them). This study explores the intrapersonal function of gestures and their relationship to the semantic content of the speech they accompany. Sixty-eight adults participated in pairs, communicating on an object description task. Visibility of the partner was manipulated: participants completed half the task behind a screen. Participants produced significantly more iconic gestures for praxic items (i.e., items with physically manipulable properties) than for non-praxic items, regardless of the visibility of their partner. These findings support the semantic specificity hypothesis, whereby a gesture is integrally associated with the semantic properties of the word it accompanies. Where those semantic properties include a high motor component, the likelihood of a gesture being produced is increased, irrespective of communication demands.
Original language: English
Pages (from-to): 169-178
Journal: Journal of Nonverbal Behavior
Volume: 34
Issue number: 3
Publication status: Published - 2010
