Individual differences are more important than the emotional category for the perception of emotional expressions
Elena Moltchanova | University of Canterbury
Christoph Bartneck | University of Canterbury
Emotional facial expressions are an important communication channel between artificial characters and their users. Humans are trained to perceive emotions, and robots and virtual agents can use facial expressions to make their inner states transparent. The literature reports that some emotion types, such as anger, are perceived as more intense than others, and other studies indicate that gender influences this perception. Our study shows that once the individual differences amongst participants are included in the statistical analysis, the emotion type has no further explanatory power. Artificial characters should therefore adapt to their specific users.
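To make the claim about explanatory power concrete: it corresponds to comparing an ordinal regression model that contains only participant-level random effects against the same model with the emotion category added as a fixed effect. Below is a minimal sketch of such a comparison using cumulative link mixed models from R's `ordinal` package; the data frame `ratings` and the variables `intensity`, `emotion`, and `participant` are illustrative assumptions, not the authors' actual analysis code.

```r
# Minimal sketch of the model comparison implied by the abstract.
# Assumes 'ratings' is a long-format data frame with one row per trial:
#   intensity:   ordered factor (the perceived-intensity rating)
#   emotion:     factor coding the expressed emotion category
#   participant: factor identifying each rater
library(ordinal)

# Baseline: individual differences only (random intercept per participant)
m0 <- clmm(intensity ~ 1 + (1 | participant), data = ratings)

# Alternative: emotion category added as a fixed effect
m1 <- clmm(intensity ~ emotion + (1 | participant), data = ratings)

# Likelihood-ratio test: does emotion type add explanatory power
# once individual differences are accounted for?
anova(m0, m1)
```

Under the finding reported in the abstract, the likelihood-ratio test would show no significant improvement from adding emotion once the participant random effects are in the model.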
Keywords: emotion, expression, gender, individual differences
Article outline
- 1. Introduction
- 2. Method
- 2.1 Participants
- 2.2 Process
- 2.3 Stimuli
- 2.4 Measurements
- 3. Results
- 4. Statistical Methods
- 5. Discussion
- References
Published online: 08 December 2017
https://doi.org/10.1075/is.18.2.01mol