Cited by

Cited by 17 other publications

Amini, Reza, Christine Lisetti & Guido Ruiz
2015. HapFACS 3.0: FACS-Based Facial Expression Generator for 3D Speaking Virtual Characters. IEEE Transactions on Affective Computing 6:4, pp. 348 ff.
Benyon, David, Björn Gambäck, Preben Hansen, Oli Mival & Nick Webb
2013. How Was Your Day? Evaluating a Conversational Companion. IEEE Transactions on Affective Computing 4:3, pp. 299 ff.
Ding, Yu, Jing Huang & Catherine Pelachaud
2017. Audio-Driven Laughter Behavior Controller. IEEE Transactions on Affective Computing 8:4, pp. 546 ff.
Ding, Yu, Catherine Pelachaud & Thierry Artières
2013. Modeling Multimodal Behaviors from Speech Prosody. In Intelligent Virtual Agents [Lecture Notes in Computer Science, 8108], pp. 217 ff.
Ding, Yu, Mathieu Radenen, Thierry Artières & Catherine Pelachaud
2013. 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, pp. 3756 ff.
D’Errico, Francesca & Isabella Poggi
2016. Social Emotions. A Challenge for Sentiment Analysis and User Models. In Emotions and Personality in Personalized Services [Human–Computer Interaction Series], pp. 13 ff.
Jain, Nishant & Gaurav Goel
2020. 2020 8th International Conference on Reliability, Infocom Technologies and Optimization (Trends and Future Directions) (ICRITO), pp. 768 ff.
Kopp, Stefan & Teena Hassan
2022. The Fabric of Socially Interactive Agents: Multimodal Interaction Architectures. In The Handbook on Socially Interactive Agents, pp. 77 ff.
Lala, Divesh & Toyoaki Nishida
2017. A data-driven passing interaction model for embodied basketball agents. Journal of Intelligent Information Systems 48:1, pp. 27 ff.
Langlet, Caroline & Chloé Clavel
2015. 2015 International Conference on Affective Computing and Intelligent Interaction (ACII), pp. 14 ff.
Ochs, Magalie & Catherine Pelachaud
2013. Socially Aware Virtual Characters: The Social Signal of Smiles [Social Sciences]. IEEE Signal Processing Magazine 30:2, pp. 128 ff.
Ochs, Magalie, Catherine Pelachaud & Gary McKeown
2017. A User Perception-Based Approach to Create Smiling Embodied Conversational Agents. ACM Transactions on Interactive Intelligent Systems 7:1, pp. 1 ff.
Polceanu, Mihai
2013. 2013 IEEE Conference on Computational Intelligence in Games (CIG), pp. 1 ff.
Pérez, Joaquín, Yanet Sánchez, Francisco J. Serón & Eva Cerezo
2017. Interacting with a Semantic Affective ECA. In Intelligent Virtual Agents [Lecture Notes in Computer Science, 10498], pp. 374 ff.
Shidara, Kazuhiro, Hiroki Tanaka, Hiroyoshi Adachi, Daisuke Kanayama, Takashi Kudo & Satoshi Nakamura
2024. Adapting the Number of Questions Based on Detected Psychological Distress for Cognitive Behavioral Therapy With an Embodied Conversational Agent: Comparative Study. JMIR Formative Research 8, pp. e50056 ff.
Sánchez-López, Yanet & Eva Cerezo
2019. Designing emotional BDI agents: good practices and open questions. The Knowledge Engineering Review 34.
Thézé, Raphaël, Mehdi Ali Gadiri, Louis Albert, Antoine Provost, Anne-Lise Giraud & Pierre Mégevand
2020. Animated virtual characters to explore audio-visual speech in controlled and naturalistic environments. Scientific Reports 10:1.

This list is based on CrossRef data as of 24 March 2024. Please note that it may not be complete. Sources presented here have been supplied by the respective publishers. Any errors therein should be reported to them.