Gestures and their concurrent words are often said to be meaningfully related and co-expressive, and research has shown that gestures and words are each particularly suited to conveying different kinds of information. In this paper, we describe and compare three methods for investigating the relationship between gestures and words: (1) an analysis of deictic expressions referring to gestures, (2) an analysis of the redundancy between information presented in words versus in gestures, and (3) an analysis of the semantic features represented in words and gestures. We applied each of these three methods to one data set, in which 22 pairs of participants used words and gestures to design the layout of an apartment. Each analysis revealed a different picture of the complementary relationship between gesture and speech. According to the deictic analysis, speakers marked only a quarter of their gestures as providing essential information that was missing from their speech, whereas the redundancy analysis indicated that almost all gestures contributed information that was not in the words. The semantic feature analysis showed that participants conveyed spatial information in their gestures more often than in their words; a follow-up analysis showed that they conveyed categorical information (i.e., the name of each room) in their words. Of the three methods, the semantic feature analysis yielded the most detailed picture of the data and generated additional analyses. We conclude that although analyses of deictic expressions and redundancy are useful for characterizing gesture use under differing conditions, the semantic feature method is best for exploring the complementary, semantic relationship between gesture and speech.