Article published in:
Social Cues in Robot Interaction, Trust and Acceptance
Edited by Alessandra Rossi, Kheng Lee Koay, Silvia Moros, Patrick Holthaus and Marcus Scheunemann
[Interaction Studies 20:3] 2019
pp. 487–508


App, B., McIntosh, D. N., Reed, C. L., and Hertenstein, M. J.
(2011) Nonverbal channel use in communication of emotion: How may depend on why. Emotion, 11(3):603–617.
App, B., Reed, C. L., and McIntosh, D. N.
(2012) Relative contributions of face and body configurations: Perceiving emotional state and motion intention. Cognition and Emotion, 26(4):690–698.
Bartneck, C., Reichenbach, J., and Van Breemen, A.
(2004) In your face, robot! The influence of a character's embodiment on how users perceive its emotional expressions. In Design and Emotion.
Beck, A., Cañamero, L., Hiolle, A., Damiano, L., Cosi, P., Tesser, F., and Sommavilla, G.
(2013) Interpretation of emotional body language displayed by a humanoid robot: A case study with children. International Journal of Social Robotics, 5(3):325–334.
Beck, A., Hiolle, A., Mazel, A., and Cañamero, L.
(2010) Interpretation of emotional body language displayed by robots. In Proceedings of the 3rd International Workshop on Affective Interaction in Natural Environments, AFFINE ’10, pages 37–42, New York, NY, USA. ACM.
Biele, C. and Grabowska, A.
(2006) Sex differences in perception of emotion intensity in dynamic and static facial expressions. Experimental Brain Research, 171(1):1–6.
Breazeal, C.
(2001) Emotive qualities in robot speech. In Proceedings 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, volume 3, pages 1388–1394.
Breazeal, C. and Scassellati, B.
(1999) How to build robots that make friends and influence people. In Proceedings 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems. Human and Environment Friendly Robots with High Intelligence and Emotional Quotients (Cat. No.99CH36289), volume 2, pages 858–863.
Burattini, E. and Rossi, S.
(2010) Periodic activations of behaviours and emotional adaptation in behaviour-based robotics. Connection Science, 22(3):197–213.
Calder, A. J., Keane, J., Manly, T., Sprengelmeyer, R., Scott, S., Nimmo-Smith, I., and Young, A. W.
(2003) Facial expression recognition across the adult life span. Neuropsychologia, 41(2):195–202. Special issue: The cognitive neuroscience of social behavior.
Calvo, M. G. and Nummenmaa, L.
(2016) Perceptual and affective mechanisms in facial expression recognition: An integrative review. Cognition and Emotion, 30(6):1081–1106.
Conti, D., Cirasa, C., Di Nuovo, S., et al.
(2019) Robot, tell me a tale!: A social robot as a tool for teachers in kindergarten. Interaction Studies, 20(2):1–16.
Ekman, P.
(1992) An argument for basic emotions. Cognition and Emotion, 6(3–4):169–200.
Häring, M., Bee, N., and André, E.
(2011) Creation and evaluation of emotion expression with body movement, sound and eye color for humanoid robots. In RO-MAN, pages 204–209.
Jack, R. E., Garrod, O. G., and Schyns, P. G.
(2014) Dynamic facial expressions of emotion transmit an evolving hierarchy of signals over time. Current Biology, 24(2):187–192.
Kleinsmith, A. and Bianchi-Berthouze, N.
(2013) Affective body expression perception and recognition: A survey. IEEE Transactions on Affective Computing, 4(1):15–33.
Leite, I.
(2015) Long-term interactions with empathic social robots. AI Matters, 1(3):13–15.
Li, X., MacDonald, B., and Watson, C. I.
(2009) Expressive facial speech synthesis on a robotic platform. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS’09, pages 5009–5014. IEEE Press.
Lim, A., Ogata, T., and Okuno, H. G.
(2012) The desire model: Cross-modal emotion analysis and expression for robots.
Marmpena, M., Lim, A., and Dahl, T. S.
(2017) How does the robot feel? Annotation of emotional expressions generated by a humanoid robot with affective quantifiers. In Proceedings of the 2017 Workshop on Behavior Adaptation, Interaction and Learning for Assistive Robotics (BAILAR – IEEE RO-MAN 2017).
(2018) How does the robot feel? Perception of valence and arousal in emotional body language. Paladyn, Journal of Behavioral Robotics, 9(1):168–182.
McColl, D. and Nejat, G.
(2014) Recognizing emotional body language displayed by a humanlike social robot. International Journal of Social Robotics, 6(2):261–280.
Moltchanova, E. and Bartneck, C.
(2017) Individual differences are more important than the emotional category for the perception of emotional expressions. Interaction Studies, 18(2):161–173.
Mutlu, B., Yamaoka, F., Kanda, T., Ishiguro, H., and Hagita, N.
(2009) Nonverbal leakage in robots: Communication of intentions through seemingly unintentional behavior. In 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pages 69–76.
Nijdam, N. A.
(2009) Mapping emotion to color, pages 2–9.
Ortony, A., Clore, G. L., and Collins, A.
(1990) The cognitive structure of emotions. Cambridge University Press.
Pereira, A., Leite, I., Mascarenhas, S., Martinho, C., and Paiva, A.
(2011) Using empathy to improve human-robot relationships. In Human-Robot Personal Relationships, pages 130–138, Berlin, Heidelberg. Springer Berlin Heidelberg.
Rosenthal-von der Pütten, A. M., Krämer, N. C., and Herrmann, J.
(2018) The effects of humanlike and robot-specific affective nonverbal behavior on perception, emotion, and behavior. International Journal of Social Robotics, 10(5):569–582.
Rossi, S., Ferland, F., and Tapus, A.
(2017) User profiling and behavioral adaptation for HRI: A survey. Pattern Recognition Letters, 99(Supplement C):3–12. Special issue: User Profiling and Behavior Adaptation for Human-Robot Interaction.
Rossi, S., Staffa, M., and Tamburro, A.
(2018) Socially assistive robot for providing recommendations: Comparing a humanoid robot with a mobile application. International Journal of Social Robotics, 10(2):265–278.
Russell, J. A.
(1980) A circumplex model of affect. Journal of Personality and Social Psychology, 39(6):1161–1178.
Salem, M., Eyssel, F., Rohlfing, K., Kopp, S., and Joublin, F.
(2011) Effects of gesture on the perception of psychological anthropomorphism: A case study with a humanoid robot. In Social Robotics, pages 31–41, Berlin, Heidelberg. Springer Berlin Heidelberg.
Scherer, K. R., Schorr, A., and Johnstone, T.
(2001) Appraisal processes in emotion: Theory, methods, research. Oxford University Press.
Schlosberg, H.
(1954) Three dimensions of emotion. Psychological Review, 61(2):81.
Song, S. and Yamada, S.
(2017a) Expressing emotions through color, sound, and vibration with an appearance-constrained social robot. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, HRI ’17, pages 2–11, New York, NY, USA. ACM.
(2017b) Expressing emotions through color, sound, and vibration with an appearance-constrained social robot. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, pages 2–11. ACM.
Tonks, J., Williams, W. H., Frampton, I., Yates, P., and Slater, A.
(2007) Assessing emotion recognition in 9–15-year-olds: Preliminary analysis of abilities in reading emotion from faces, voices and eyes. Brain Injury, 21(6):623–629.
Tsiourti, C., Weiss, A., Wac, K., and Vincze, M.
(2017) Designing emotionally expressive robots: A comparative study on the perception of communication modalities. In Proceedings of the 5th International Conference on Human Agent Interaction, HAI ’17, pages 213–222, New York, NY, USA. ACM.
Valdez, P. and Mehrabian, A.
(1994) Effects of color on emotions. Journal of Experimental Psychology: General, 123(4):394.
Wilhelm, O., Hildebrandt, A., Manske, K., Schacht, A., and Sommer, W.
(2014) Test battery for measuring the perception and recognition of facial expressions of emotion. Frontiers in Psychology, 5:404.
Wundt, W. M.
(1907) Outlines of psychology. W. Engelmann.
Xu, J., Broekens, J., Hindriks, K., and Neerincx, M. A.
(2014) Robot mood is contagious: Effects of robot body language in the imitation game. In Proceedings of the 2014 International Conference on Autonomous Agents and Multi-agent Systems, AAMAS ’14, pages 973–980, Richland, SC. International Foundation for Autonomous Agents and Multiagent Systems.
Cited by 5 other publications

Liu, Dong, Zhiyong Wang, Lifeng Wang & Longxi Chen
2021. Multi-Modal Fusion Emotion Recognition Method of Speech Expression Based on Deep Learning. Frontiers in Neurorobotics, 15.
Rossi, Silvia, Teresa Cimmino, Marco Matarese & Mario Raiano
2019. In 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), pp. 1 ff.
Rossi, Silvia, Elena Dell’Aquila & Benedetta Bucci
2019. In Social Robotics [Lecture Notes in Computer Science, 11876], pp. 505 ff.
Rossi, Silvia, Marwa Larafa & Martina Ruocco
2020. Emotional and Behavioural Distraction by a Social Robot for Children Anxiety Reduction During Vaccination. International Journal of Social Robotics, 12(3), pp. 765 ff.
Spezialetti, Matteo, Giuseppe Placidi & Silvia Rossi
2020. Emotion Recognition for Human-Robot Interaction: Recent Advances and Future Perspectives. Frontiers in Robotics and AI, 7.

This list is based on CrossRef data as of 14 September 2021. Please note that it may not be complete. Sources presented here have been supplied by the respective publishers; any errors therein should be reported to them.