We examined the cognitive resources
involved in processing speech with gesture compared to the same speech without
gesture across four studies using a dual-task paradigm. Participants viewed videos of a woman describing
spatial arrays either with gesture or without. They then attempted to choose the
target array from among four choices. Participants’ cognitive load during this comprehension task was indexed by how well they remembered the location and identity of digits in a secondary task. We found that addressees experience additional visuospatial load when processing speech with gesture compared to speech alone, and that this load arises primarily when addressees draw on their memory of the gestured descriptions to choose the target array. However, this cost occurs only when gestures about horizontal spatial relations (i.e., left and right) are produced from the speaker’s egocentric perspective.