Article published in:
Interaction Studies
Vol. 22:2 (2021), pp. 141–176
References (67)
Admoni, H., Hayes, B., Feil-Seifer, D., Ullman, D., & Scassellati, B.
(2013) Are you looking at me? Perception of robot attention is mediated by gaze type and group size. 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 389–395.
Admoni, H., & Scassellati, B.
(2017) Social eye gaze in human-robot interaction: A review. Journal of Human-Robot Interaction, 6(1), 25.
Ahmadzadeh, S. R., Paikan, A., Mastrogiovanni, F., Natale, L., Kormushev, P., & Caldwell, D. G.
(2015) Learning symbolic representations of actions from human demonstrations. 2015 IEEE International Conference on Robotics and Automation (ICRA), 3801–3808.
Aliasghari, P., Ghafurian, M., Nehaniv, C. L., & Dautenhahn, K.
(2021) Effects of gaze and arm motion kinesics on a humanoid’s perceived confidence, eagerness to learn, and attention to the task in a teaching scenario. HRI ’21: Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction, 197–206.
Andrist, S., Mutlu, B., & Tapus, A.
(2015) Look like me: Matching robot personality via gaze to increase motivation. CHI ’15: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 3603–3612.
Barrick, M. R., & Mount, M. K.
(1991) The big five personality dimensions and job performance: A meta-analysis. Personnel Psychology, 44(1), 1–26.
Bartneck, C., Duenser, A., Moltchanova, E., & Zawieska, K.
(2015) Comparing the similarity of responses received from studies in Amazon Mechanical Turk to studies conducted online and with direct recruitment. PLOS ONE, 10(4), 1–23.
Bates, D., Mächler, M., Bolker, B. M., & Walker, S. C.
(2015) Fitting linear mixed-effects models using lme4. Journal of Statistical Software, 67(1).
Billard, A., Calinon, S., Dillmann, R., & Schaal, S.
(2008) Robot programming by demonstration. In Springer handbook of robotics (pp. 1371–1394). Springer.
Birdwhistell, R. L.
(1983) Background to kinesics. ETC: A Review of General Semantics, 40(3), 352–361.
Biswas, M., Romeo, M., Cangelosi, A., & Jones, R. B.
(2020) Are older people any different from younger people in the way they want to interact with robots? Scenario based survey. Journal on Multimodal User Interfaces, 14(1), 61–72.
Bozdogan, H.
(1987) Model selection and Akaike’s Information Criterion (AIC): The general theory and its analytical extensions. Psychometrika, 52(3), 345–370.
Breazeal, C.
(2009) Role of expressive behaviour for robots that learn from people. Philosophical Transactions of the Royal Society B: Biological Sciences, 364(1535), 3527–3538.
Brooks, A. G., & Arkin, R. C.
(2007) Behavioral overlays for non-verbal communication expression on a humanoid robot. Autonomous Robots, 22(1), 55–74.
Bruneau, T.
(2012) Chronemics: Time-binding and the construction of personal time. ETC: A Review of General Semantics, 69(1), 72–92.
Cangelosi, A., & Stramandinoli, F.
(2018) A review of abstract concept learning in embodied agents and robots. Philosophical Transactions of the Royal Society B: Biological Sciences, 373(1752), 2–7.
Chao, C., Cakmak, M., & Thomaz, A. L.
(2010) Transparent active learning for robots. 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 317–324.
Claret, J. A., Venture, G., & Basañez, L.
(2017) Exploiting the robot kinematic redundancy for emotion conveyance to humans as a lower priority task. International Journal of Social Robotics, 9(2), 277–292.
Dautenhahn, K.
(2007) Socially intelligent robots: Dimensions of human-robot interaction. Philosophical Transactions of the Royal Society B: Biological Sciences, 362(1480), 679–704.
Di Cesare, G.
(2020) The importance of the affective component of movement in action understanding. In Modelling human motion: From human perception to robot design (pp. 103–116). Springer.
Dragan, A., & Srinivasa, S.
(2013) Generating legible motion. Proceedings of Robotics: Science and Systems.
Emery, N.
(2000) The eyes have it: The neuroethology, function and evolution of social gaze. Neuroscience & Biobehavioral Reviews, 24(6), 581–604.
English, B. A., Coates, A., & Howard, A.
(2017) Recognition of gestural behaviors expressed by humanoid robotic platforms for teaching affect recognition to children with autism – A healthy subjects pilot study. International Conference on Social Robotics, 567–576.
Farroni, T., Csibra, G., Simion, F., & Johnson, M. H.
(2002) Eye contact detection in humans from birth. Proceedings of the National Academy of Sciences, 99(14), 9602–9605.
Feil-Seifer, D., Haring, K. S., Rossi, S., Wagner, A. R., & Williams, T.
(2020) Where to next? The impact of COVID-19 on human-robot interaction research. ACM Transactions on Human-Robot Interaction, 10(1).
Fischer, K., & Saunders, J.
(2012) Getting acquainted with a developing robot. HBU ’12: Proceedings of the Third International Conference on Human Behavior Understanding, 125–133.
Funke, F., & Reips, U.-D.
(2012) Why semantic differentials in web-based research should be made from visual analogue scales and not from 5-point scales. Field Methods, 24(3), 310–327.
Ghafurian, M., Budnarain, N., & Hoey, J.
(2019) Improving humanness of virtual agents and users’ cooperation through emotions. [URL]
Glowinski, D., Dael, N., Camurri, A., Volpe, G., Mortillaro, M., & Scherer, K.
(2011) Toward a minimal representation of affective gestures. IEEE Transactions on Affective Computing, 2(2), 106–118.
Hietanen, J. K., Leppänen, J. M., Peltola, M. J., Linna-aho, K., & Ruuhiala, H. J.
(2008) Seeing direct and averted gaze activates the approach-avoidance motivational brain systems. Neuropsychologia, 46(9), 2423–2430.
Hortensius, R., Hekele, F., & Cross, E. S.
(2018) The perception of emotion in artificial agents. IEEE Transactions on Cognitive and Developmental Systems, 10(4), 852–864.
Hosseinpanah, A., Kramer, N. C., & Straßmann, C.
(2018) Empathy for everyone?: The effect of age when evaluating a virtual agent. HAI ’18: Proceedings of the 6th International Conference on Human-Agent Interaction, 184–190.
Huang, S. H., Huang, I., Pandya, R., & Dragan, A. D.
(2020) Nonverbal robot feedback for human teachers. Proceedings of the Conference on Robot Learning, 1038–1051. [URL]
Ito, A., Hayakawa, S., & Terada, T.
(2004) Why robots need body for mind communication – An attempt of eye-contact between human and robot. RO-MAN 2004: 13th IEEE International Workshop on Robot and Human Interactive Communication, 473–478.
Jonell, P., Kucherenko, T., Torre, I., & Beskow, J.
(2020) Can we trust online crowdworkers? Comparing online and offline participants in a preference test of virtual agents. IVA ’20: Proceedings of the 20th ACM International Conference on Intelligent Virtual Agents.
Joo, H., Simon, T., Cikara, M., & Sheikh, Y.
(2019) Towards social artificial intelligence: Nonverbal social signal prediction in a triadic interaction. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 10865–10875.
Kaiser, F. G., Glatte, K., & Lauckner, M.
(2019) How to make nonhumanoid mobile robots more likable: Employing kinesic courtesy cues to promote appreciation. Applied Ergonomics, 78, 70–75.
Kazuaki, T., Motoyuki, O., & Natsuki, O.
(2010) The hesitation of a robot: A delay in its motion increases learning efficiency and impresses humans as teachable. 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 189–190.
Kim, J., Cauli, N., Vicente, P., Damas, B., Cavallo, F., & Santos-Victor, J.
(2018) iCub, clean the table! A robot learning from demonstration approach using deep neural networks. 18th IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC 2018), 3–9.
Kim, J., Kwak, S. S., & Kim, M.
(2009) Entertainment robot personality design based on basic factors of motions: A case study with ROLLY. RO-MAN 2009 – The 18th IEEE International Symposium on Robot and Human Interactive Communication, 803–808.
Koenig, N., Takayama, L., & Matarić, M.
(2010) Communication and knowledge sharing in human-robot interaction and learning from demonstration. Neural Networks, 23(8–9), 1104–1112.
Kulić, D., & Croft, E.
(2007) Physiological and subjective responses to articulated robot motion. Robotica, 25(1), 13–27.
Laird, J. E., Gluck, K., Anderson, J., Forbus, K. D., Jenkins, O. C., Lebiere, C., Salvucci, D., Scheutz, M., Thomaz, A., Trafton, G., et al.
(2017) Interactive task learning. IEEE Intelligent Systems, 32(4), 6–21.
Maljkovic, V., & Nakayama, K.
(1994) Priming of pop-out: I. Role of features. Memory & Cognition, 22(6), 657–672.
Matejka, J., Glueck, M., Grossman, T., & Fitzmaurice, G.
(2016) The effect of visual appearance on the performance of continuous sliders and visual analogue scales. CHI ’16: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 5421–5432.
Mavridis, N.
(2015) A review of verbal and non-verbal human-robot interactive communication. Robotics and Autonomous Systems, 63(P1), 22–35.
Metta, G., Fitzpatrick, P., & Natale, L.
(2006) YARP: Yet another robot platform. International Journal of Advanced Robotic Systems, 3(1), 43–48.
Metta, G., Sandini, G., Vernon, D., Natale, L., & Nori, F.
(2008) The iCub humanoid robot: An open platform for research in embodied cognition. PerMIS ’08: Proceedings of the 8th Workshop on Performance Metrics for Intelligent Systems, 50–56.
Moon, A., Panton, B., Van der Loos, H. F. M., & Croft, E. A.
(2010) Using hesitation gestures for safe and ethical human-robot interaction. IEEE Conference on Robotics and Automation: Workshop on Interactive Communication for Autonomous Intelligent Robots, 1–3.
Moon, A., Parker, C. A. C., Croft, E. A., & Van der Loos, H. F. M.
(2013) Design and impact of hesitation gestures during human-robot resource conflicts. Journal of Human-Robot Interaction, 2(3), 18–40.
Muto, Y., Takasugi, S., Yamamoto, T., & Miyake, Y.
(2009) Timing control of utterance and gesture in interaction between human and humanoid robot. RO-MAN 2009 – The 18th IEEE International Symposium on Robot and Human Interactive Communication, 1022–1028.
Nehaniv, C. L., Dautenhahn, K., Kubacki, J., Haegele, M., Parlitz, C., & Alami, R.
(2005) A methodological approach relating the classification of gesture to identification of human intent in the context of human-robot interaction. RO-MAN 2005: IEEE International Workshop on Robot and Human Interactive Communication, 371–377.
Normoyle, A., Badler, J. B., Fan, T., Badler, N. I., Cassol, V. J., & Musse, S. R.
(2013) Evaluating perceived trust from procedurally animated gaze. MIG ’13: Proceedings of Motion on Games, 119–126.
Paolacci, G., Chandler, J., & Ipeirotis, P. G.
(2010) Running experiments on Amazon Mechanical Turk. Judgment and Decision Making, 5(5), 411–419.
Peters, R., Broekens, J., & Neerincx, M. A.
(2017) Robots educate in style: The effect of context and non-verbal behaviour on children’s perceptions of warmth and competence. 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 449–455.
Pitsch, K., Lohan, K. S., Rohlfing, K., Saunders, J., Nehaniv, C. L., & Wrede, B.
(2012) Better be reactive at the beginning. Implications of the first seconds of an encounter for the tutoring style in human-robot-interaction. 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication, 974–981.
Robert Jr., L. P., Alahmad, R., Esterwood, C., Kim, S., You, S., & Zhang, Q.
(2020) A review of personality in human-robot interactions. Foundations and Trends in Information Systems, 4(2), 107–212.
Robins, B., Dautenhahn, K., Nehaniv, C. L., Mirza, N. A., François, D., & Olsson, L.
(2005) Sustaining interaction dynamics and engagement in dyadic child-robot interaction kinesics: Lessons learnt from an exploratory study. RO-MAN 2005: IEEE International Workshop on Robot and Human Interactive Communication, 716–722.
Saerbeck, M., & Bartneck, C.
(2010) Perception of affect elicited by robot motion. 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 53–60.
Salem, M., Kopp, S., Wachsmuth, I., Rohlfing, K., & Joublin, F.
(2012) Generation and evaluation of communicative robot gesture. International Journal of Social Robotics, 4(2), 201–217.
Saunderson, S., & Nejat, G.
(2019) How robots influence humans: A survey of nonverbal communication in social human-robot interaction. International Journal of Social Robotics, 11(4), 575–608.
Treiblmaier, H., & Filzmoser, P.
(2011) Benefits from using continuous rating scales in online survey research. Thirty Second International Conference on Information Systems, 2087–2099.
Vannucci, F., Di Cesare, G., Rea, F., Sandini, G., & Sciutti, A.
(2018) A robot with style: Can robotic attitudes influence human actions? 2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids), 952–957.
Venture, G., & Kulić, D.
(2019) Robot expressive motions: A survey of generation and evaluation methods. ACM Transactions on Human-Robot Interaction, 8(4).
Wallkötter, S., Stower, R., Kappas, A., & Castellano, G.
(2020) A robot by any other frame: Framing and behaviour influence mind perception in virtual but not real-world environments. HRI ’20: ACM/IEEE International Conference on Human-Robot Interaction, 609–618.
Whiten, A., & van de Waal, E.
(2018) The pervasive role of social learning in primate lifetime development. Behavioral Ecology and Sociobiology, 72(5), 80.
Wrede, B., Rohlfing, K. J., Hanheide, M., & Sagerer, G.
(2009) Towards learning by interacting. In Creating brain-like intelligence (pp. 139–150). Springer.
Cited by (2)

Aliasghari, Pourya, Moojan Ghafurian, Chrystopher L. Nehaniv & Kerstin Dautenhahn
2022. Kinesthetic Teaching of a Robot over Multiple Sessions: Impacts on Speed and Success. In Social Robotics [Lecture Notes in Computer Science, 13818], pp. 160 ff.
Aliasghari, Pourya, Moojan Ghafurian, Chrystopher L. Nehaniv & Kerstin Dautenhahn
2023. How Do We Perceive Our Trainee Robots? Exploring the Impact of Robot Errors and Appearance When Performing Domestic Physical Tasks on Teachers’ Trust and Evaluations. ACM Transactions on Human-Robot Interaction 12:3, pp. 1 ff.

This list is based on CrossRef data as of 4 July 2024. Please note that it may not be complete. Sources presented here have been supplied by the respective publishers. Any errors therein should be reported to them.