Individual differences are more important than the emotional category for the perception of emotional expressions
Emotional facial expressions are an important communication channel between artificial characters and their users. Humans are trained to perceive emotions, and robots and virtual agents can use facial expressions to make their inner states transparent. The literature reports that some emotion types, such as anger, are perceived as more intense than others. Other studies indicate that gender influences this perception. Our study shows that once individual differences among participants are included in the statistical analysis, the emotion type has no further explanatory power. Artificial characters should therefore adapt to their specific users.
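The core statistical claim can be illustrated with a minimal sketch. The code below is not the authors' exact analysis; it assumes intensity ratings are modelled with cumulative link (mixed) models from R's ordinal package, and the data frame `ratings` with columns `intensity`, `emotion`, and `participant` is hypothetical. It compares a model containing only a per-participant random intercept against one that adds the emotion category, the kind of comparison behind the claim that emotion type adds no further explanatory power once individual differences are accounted for.

```r
# Minimal sketch (hypothetical data, not the authors' exact analysis):
# intensity ratings as an ordered factor, modelled with cumulative link
# mixed models from the 'ordinal' package.
library(ordinal)

ratings$intensity <- ordered(ratings$intensity)   # response must be ordinal

# Baseline: only individual differences (random intercept per participant)
m_individual <- clmm(intensity ~ 1 + (1 | participant), data = ratings)

# Adds the emotion category as a fixed effect on top of individual differences
m_full <- clmm(intensity ~ emotion + (1 | participant), data = ratings)

# Likelihood-ratio test: does emotion type explain anything beyond the
# participant-level differences?
anova(m_individual, m_full)
```

A cumulative link model respects the ordinal nature of intensity ratings rather than treating them as interval data, which is why such models are a natural fit for this kind of comparison.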
Article outline
- 1. Introduction
- 2. Method
- 2.1 Participants
- 2.2 Process
- 2.3 Stimuli
- 2.4 Measurements
- 3. Results
- 4. Statistical Methods
- 5. Discussion