This paper describes our general framework for investigating how human gestures can facilitate interaction and communication between humans and robots. Two studies were carried out to reveal which “naturally occurring” gestures can be observed in a scenario where users explain to a robot how to perform a home task. Both studies followed a within-subjects design: participants demonstrated to a robot how to lay a table using two different methods, either gestures alone or gestures combined with speech. The first study enabled the validation of the COGNIRON coding scheme for human gestures in Human–Robot Interaction (HRI). Based on the data collected in both studies, an annotated video corpus was produced, and characteristics such as the frequency and duration of the different gestural classes were gathered to help capture requirements for designers of HRI systems. The results of the first study regarding the frequencies of the gestural types suggest an interaction between the order of presentation of the two methods and the types of gestures actually produced. However, analysis of the speech produced along with the gestures revealed no differences due to the ordering of the experimental conditions. The second study expands the issues addressed by the first: we aimed to extend the role of the interaction partner (the robot) by introducing positive acknowledgement of the participants’ activity. In contrast to the first study, the results show no significant differences in the distribution of gestures (frequency and duration) between the two explanation methods. Implications for HRI are discussed, focusing on issues relevant to the design of the robot’s communication skills to support the interaction loop with humans in home scenarios.