In this chapter we present our work toward building a conversational Companion. Conversing with a partner means being able to express one's mental and emotional state, whether as a speaker or a listener. One must also adapt to one's partner's reactions to what one is saying. We have developed an interactive ECA platform, Greta (Pelachaud, 2005): a 3D virtual agent capable of communicating expressive verbal and nonverbal behaviors as well as listening. It can use its gaze, facial expressions, and gestures to convey a meaning, an attitude, or an emotion. Multimodal behaviors are tightly coupled with one another; a synchronization scheme allows the agent to display a raised eyebrow or a beat gesture on a given word. Depending on its emotional or mental state, the agent may vary the quality of its behaviors: it may use a more or less extended gesture, and its arms can move at different speeds and with different accelerations (Mancini & Pelachaud, 2008). The agent can also display listener behavior (Bevacqua et al., 2008), interacting actively with users and/or other agents by providing appropriately timed backchannels. Interaction also requires that the interactants adapt to each other's behaviors, so the dynamic coupling between them must be considered (Prepin & Revel, 2007).
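To make the idea of behavior-quality modulation concrete, the sketch below shows one way expressivity parameters could scale a gesture's amplitude and speed. This is a minimal illustration only: the parameter names (`spatial_extent`, `temporal_extent`) are loosely inspired by the expressivity dimensions discussed in Mancini & Pelachaud (2008), and the linear scaling factors are our own assumptions, not the actual Greta implementation.

```python
from dataclasses import dataclass

@dataclass
class Expressivity:
    """Illustrative expressivity parameters, each in [-1, 1]."""
    spatial_extent: float = 0.0   # -1: contracted gestures .. 1: expanded gestures
    temporal_extent: float = 0.0  # -1: slower movement   .. 1: faster movement

def modulate(amplitude_cm: float, duration_s: float, expr: Expressivity):
    """Scale a gesture's amplitude and duration by the agent's expressivity.

    The 0.5 gain is an arbitrary illustrative choice: neutral expressivity
    (all zeros) leaves the gesture unchanged.
    """
    amp = amplitude_cm * (1.0 + 0.5 * expr.spatial_extent)
    dur = duration_s / (1.0 + 0.5 * expr.temporal_extent)
    return amp, dur

# A highly aroused agent might perform larger, faster gestures:
amp, dur = modulate(30.0, 1.2, Expressivity(spatial_extent=0.8, temporal_extent=0.6))
```

In a full platform, such parameters would feed the animation engine so that the same symbolic gesture (e.g. a beat on a given word) is realized differently according to the agent's emotional or mental state.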