Robots should be able to represent emotional states to interact with people as social agents. In many cases, robots cannot have bio-inspired bodies, for instance because the task to be performed requires a special shape, as with home cleaners, package carriers, and many others. In these cases, emotional states have to be conveyed by exploiting movements of the body alone. In this paper, we present a set of case studies aimed at identifying specific values to convey emotion through changes in linear and angular velocities, which might be applied to different non-anthropomorphic bodies. This work builds on some of the most widely considered theories of emotion expression and on emotion coding for people. We show that people can recognize some emotional expressions better than others, and we propose some directions for expressing emotions by exploiting only bio-neutral movement.
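As an illustration of how velocity-based emotion expression might look in practice, the sketch below maps emotion labels to linear and angular velocity parameters for a wheeled, non-anthropomorphic base. It is a minimal sketch under stated assumptions: the emotion labels, numeric values, and helper names are hypothetical and are not the values identified in the paper's case studies.

```python
"""Hypothetical sketch: expressing emotion on a non-anthropomorphic mobile base
by modulating only linear and angular velocity. All names and numbers below are
illustrative assumptions, not the empirically identified values from the paper."""

from dataclasses import dataclass
import math


@dataclass
class VelocityProfile:
    linear: float          # forward speed (m/s), assumed range
    angular: float         # peak rotational speed (rad/s), assumed range
    oscillation_hz: float  # how often the heading sways per second


# Illustrative mapping from emotion label to motion parameters.
EMOTION_PROFILES = {
    "happiness": VelocityProfile(linear=0.6, angular=1.2, oscillation_hz=0.8),
    "anger":     VelocityProfile(linear=0.8, angular=2.0, oscillation_hz=1.5),
    "sadness":   VelocityProfile(linear=0.2, angular=0.3, oscillation_hz=0.2),
    "fear":      VelocityProfile(linear=0.4, angular=1.8, oscillation_hz=2.0),
}


def velocity_command(emotion: str, t: float) -> tuple[float, float]:
    """Return a (linear, angular) velocity pair at time t for the given emotion.

    The angular component sways sinusoidally so affect is conveyed through
    motion alone, with no bio-inspired body features.
    """
    p = EMOTION_PROFILES[emotion]
    angular = p.angular * math.sin(2.0 * math.pi * p.oscillation_hz * t)
    return p.linear, angular


if __name__ == "__main__":
    # Print a short trajectory of velocity commands for one emotion.
    for step in range(5):
        t = step * 0.1
        v, w = velocity_command("happiness", t)
        print(f"t={t:.1f}s  linear={v:.2f} m/s  angular={w:+.2f} rad/s")
```

In a real system, the returned pair would be sent to the robot's velocity controller at a fixed rate; the point of the sketch is only that an emotion can be reduced to a small set of velocity parameters, independent of the body's shape.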