Edited by Jean-Marc Colletta and Michèle Guidetti
[Benjamins Current Topics 39] 2012
pp. 79–98
We have proposed that gestures play a significant role in directing infants’ attention during early word learning when caregivers synchronize the saying of a word with a dynamic gesture; this synchronization brings sight and sound together, providing a basis for perceiving them as belonging together (Zukow-Goldring & Rader, 2001). To test this claim, we presented 9- to 14-month-old infants with videos of speakers introducing a novel object using synchronous dynamic vs. static gestures (Study 1) or synchronous dynamic vs. asynchronous dynamic gestures (Study 2). Eye tracking allowed us to measure where infants looked over time during the word–object pairings and during a test for word learning. We hypothesized that dynamic gestures would draw infants’ attention from the mouth to the object, that infants would attend more to the object at the moment the word was spoken when the gesture was dynamic and synchronous with speech, and that synchrony of gesture and speech would result in better word learning. All three hypotheses were supported.