Article published in:
Vocal Interactivity in-and-between Humans, Animals and Robots
Edited by Mohamed Chetouani, Elodie F. Briefer, Angela Dassow, Ricard Marxer, Roger K. Moore, Nicolas Obin and Dan Stowell
[Interaction Studies 24:1] 2023
pp. 130–167
References
Augustine, A. C., Ryusuke, M., Liu, C., Ishi, C. T., & Ishiguro, H.
(2020) Generation and evaluation of audio-visual anger emotional expression for android robot. Companion of the 2020 ACM/IEEE International Conference on Human–Robot Interaction, 96–98.
Barchard, K. A., Lapping-Carr, L., Westfall, R. S., Fink-Armold, A., Banisetty, S. B., & Feil-Seifer, D.
(2020) Measuring the perceived social intelligence of robots. ACM Transactions on Human–Robot Interaction, 9 (4).
Bartneck, C., Kulić, D., Croft, E., & Zoghbi, S.
(2009) Measurement Instruments for the Anthropomorphism, Animacy, Likeability, Perceived Intelligence, and Perceived Safety of Robots. International Journal of Social Robotics, 1 (1), 71–81.
Bates, D., Mächler, M., Bolker, B., & Walker, S.
(2015) Fitting linear mixed-effects models using lme4. Journal of Statistical Software, 67 (1), 1–48.
Brandl, C., Mertens, A., & Schlick, C. M.
(2016) Human–Robot Interaction in Assisted Personal Services: Factors Influencing Distances That Humans Will Accept between Themselves and an Approaching Service Robot. Human Factors and Ergonomics in Manufacturing & Service Industries, 26 (6), 713–727.
Breazeal, C., Kidd, C., Thomaz, A., Hoffman, G., & Berlin, M.
(2005) Effects of nonverbal communication on efficiency and robustness in human–robot teamwork. 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, 708–713.
Campbell, N., & Mokhtari, P.
(2003) Voice quality: The 4th prosodic dimension. Proceedings of the 15th International Congress of Phonetic Sciences, 2417–2420.
Carpenter, J.
(2013) The Quiet Professional: An investigation of U.S. military Explosive Ordnance Disposal personnel interactions with everyday field robots. Doctoral dissertation, University of Washington.
Carpinella, C. M., Wyman, A. B., Perez, M. A., & Stroessner, S. J.
(2017) The Robotic Social Attributes Scale (RoSAS): Development and Validation. 2017 12th ACM/IEEE International Conference on Human–Robot Interaction (HRI), 254–262.
Carton, D., Olszowy, W., Wollherr, D., & Buss, M.
(2017) Socio-Contextual Constraints for Human Approach with a Mobile Robot. International Journal of Social Robotics, 9 (2), 309–327.
Chan, L., Zhang, B. J., & Fitter, N. T.
(2021) Designing and validating expressive Cozmo behaviors for accurately conveying emotions. 2021 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), 1037–1044.
Chen, Y. F., Everett, M., Liu, M., & How, J. P.
(2017) Socially aware motion planning with deep reinforcement learning. IEEE International Conference on Intelligent Robots and Systems, 2017-September, 1343–1350.
Dautenhahn, K., Nehaniv, C. L., Walters, M. L., Robins, B., Kose-Bagci, H., Mirza, N. A., & Blow, M.
(2009) KASPAR – a minimally expressive humanoid robot for human–robot interaction research. Applied Bionics and Biomechanics, 6 (3–4), 369–397.
Di Cesare, G., De Stefani, E., Gentilucci, M., & De Marco, D.
(2017) Vitality Forms Expressed by Others Modulate Our Own Motor Response: A Kinematic Study. Frontiers in Human Neuroscience, 11 (1), 565.
Drumm, P.
(2012) Köhler, W. In R. W. Rieber (Ed.), Encyclopedia of the history of psychological theories (pp. 610–612). Springer US.
Fischer, K., Jensen, L. C., Suvei, S. D., & Bodenhagen, L.
(2016) Between legibility and contact: The role of gaze in robot approach. 25th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN 2016, 646–651.
Gil, Ó., Garrell, A., & Sanfeliu, A.
(2021) Social robot navigation tasks: Combining machine learning techniques and social force model. Sensors, 21 (21).
Gobl, C., & Ní Chasaide, A.
(2003) The role of voice quality in communicating emotion, mood and attitude. Speech Communication, 40 (1–2), 189–212.
Guillaume, L., Aubergé, V., Magnani, R., Aman, F., Cottier, C., Sasa, Y., Wolf, C., Nebout, F., Neverova, N., Bonnefond, N., Negre, A., Tsvetanova, L., & Girard-Rivier, M.
(2015) HRI in an ecological dynamic experiment: The GEE corpus based approach for the Emox robot. 2015 IEEE International Workshop on Advanced Robotics and its Social Impacts (ARSO), 1–6.
Hall, E. T., Birdwhistell, R. L., Bock, B., Bohannan, P., Diebold, A. R., Durbin, M., Edmonson, M. S., Fischer, J. L., Hymes, D., Kimball, S. T., Barre, W. L., Frank Lynch, S. J., McClellan, J. E., Marshall, D. S., Milner, G. B., Sarles, H. B., Trager, G. L., & Vayda, A. P.
(1968) Proxemics [and comments and replies]. Current Anthropology, 9 (2/3), 83–108.
Hebesberger, D., Koertner, T., Gisinger, C., & Pripfl, J.
(2017) A long-term autonomous robot at a care hospital: A mixed methods study on social acceptance and experiences of staff and older adults. International Journal of Social Robotics, 9 (1).
Honig, S., & Oron-Gilad, T.
(2020) Comparing laboratory user studies and video-enhanced web surveys for eliciting user gestures in human–robot interactions. ACM/IEEE International Conference on Human–Robot Interaction, 248–250.
Honour, A., Banisetty, S. B., & Feil-Seifer, D.
(2021) Perceived Social Intelligence as Evaluation of Socially Navigation. Companion of the 2021 ACM/IEEE International Conference on Human–Robot Interaction, 519–523.
Irfan, B., Kennedy, J., Lemaignan, S., Papadopoulos, F., Senft, E., & Belpaeme, T.
(2018) Social Psychology and Human–Robot Interaction: An Uneasy Marriage. Companion of the 2018 ACM/IEEE International Conference on Human–Robot Interaction – HRI ’18, 13–20.
Kamezaki, M., Kobayashi, A., Yokoyama, Y., Yanagawa, H., Shrestha, M., & Sugano, S.
(2019) A Preliminary Study of Interactive Navigation Framework with Situation-Adaptive Multimodal Inducement: Pass-By Scenario. International Journal of Social Robotics.
Khambhaita, H., & Alami, R.
(2020) Viewing Robot Navigation in Human Environment as a Cooperative Activity. Springer, Cham.
Knight, H., Thielstrom, R., & Simmons, R.
(2016) Expressive path shape (Swagger): Simple features that illustrate a robot’s attitude toward its goal in real time. IEEE International Conference on Intelligent Robots and Systems, 2016-November, 1475–1482.
Kruse, T., Pandey, A. K., Alami, R., & Kirsch, A.
(2013) Human-Aware Robot Navigation: A Survey. Robotics and Autonomous Systems, 61 (12), 1726–1743.
Matsumoto, M.
(2021) Fragile Robot: The Fragility of Robots Induces User Attachment to Robots. International Journal of Mechanical Engineering and Robotics Research, 10 (10), 536–541.
Mavrogiannis, C., Hutchinson, A. M., MacDonald, J., Alves-Oliveira, P., & Knepper, R. A.
(2019) Effects of Distinct Robot Navigation Strategies on Human Behavior in a Crowded Environment. ACM/IEEE International Conference on Human–Robot Interaction, 2019-March, 421–430.
Mavrogiannis, C. I., Baldini, F., Wang, A., Zhao, D., Trautman, P., Steinfeld, A., & Oh, J.
(2021) Core challenges of social robot navigation: A survey. CoRR, abs/2103.05668.
McGinn, C., & Torre, I.
(2019) Can you tell the robot by the voice? An exploratory study on the role of voice in the perception of robots. Proceedings of the 14th ACM/IEEE International Conference on Human–Robot Interaction, 211–221.
McGurk, H., & MacDonald, J.
(1976) Hearing lips and seeing voices. Nature, 264 (5588), 746–748.
Menne, I. M., & Schwab, F.
(2018) Faces of Emotion: Investigating Emotional Facial Expressions Towards a Robot. International Journal of Social Robotics, 10 (2), 199–209.
Moon, A., Parker, C. A. C., Croft, E. A., & Van der Loos, H. F. M.
(2013) Design and Impact of Hesitation Gestures during Human–Robot Resource Conflicts. Journal of Human–Robot Interaction, 2 (3).
Mutlu, B., & Forlizzi, J.
(2008) Robots in organizations. Proceedings of the 3rd International Conference on Human–Robot Interaction – HRI ’08, 287.
Nomura, T., Suzuki, T., Kanda, T., & Kato, K.
(2006) Measurement of negative attitudes toward robots. Interaction Studies, 7 (3), 437–454.
Ramirez, O. A., Khambhaita, H., Chatila, R., Chetouani, M., & Alami, R.
(2016) Robots learning how and where to approach people. 25th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN 2016, 347–353.
Reinhardt, J., Prasch, L., & Bengler, K.
(2021) Back-off. ACM Transactions on Human–Robot Interaction, 10 (3), 1–25.
Rios-Martinez, J., Spalanzani, A., & Laugier, C.
(2015) From Proxemics Theory to Socially-Aware Navigation: A Survey. International Journal of Social Robotics, 7 (2), 137–153. DOI logoGoogle Scholar
RobAIR mobile robot, designed and built by FabMSTIC, Grenoble.
(n.d.). Accessed 2021-07-19.
Robinson, F. A., Velonaki, M., & Bown, O.
(2021) Smooth operator: Tuning robot perception through artificial movement sound. ACM/IEEE International Conference on Human–Robot Interaction, 53–62.
Rosenthal-von der Pütten, A. M., Schulte, F. P., Eimler, S. C., Sobieraj, S., Hoffmann, L., Maderwald, S., Brand, M., & Krämer, N. C.
(2014) Investigations on empathy towards humans and robots using fMRI. Computers in Human Behavior, 33 (1), 201–212.
Saerbeck, M., & Bartneck, C.
(2010) Perception of affect elicited by robot motion. Proceedings of the 5th ACM/IEEE International Conference on Human–Robot Interaction, 53–60.
Saldien, J., Vanderborght, B., Goris, K., Van Damme, M., & Lefeber, D.
(2014) A Motion System for Social and Animated Robots. International Journal of Advanced Robotic Systems, 11 (5), 72.
Sasa, Y., & Aubergé, V.
(2016) Perceived isolation and elderly boundaries in EEE (EmOz Elderly Expressions) corpus: Appeal to communication dynamics with a socio-affectively gluing robot in a smart home. Gerontechnology, 15 (1).
Sasa, Y., & Aubergé, V.
(2017) SASI: perspectives for a socio-affectively intelligent HRI dialog system. 1st Workshop on “Behavior, Emotion and Representation: Building Blocks of Interaction”.
Savery, R., Rose, R., & Weinberg, G.
(2019) Establishing human–robot trust through music-driven robotic emotion prosody and gesture. 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), 1–7.
Savery, R., Zahray, L., & Weinberg, G.
(2021) Emotional musical prosody for the enhancement of trust: Audio design for robotic arm communication. Paladyn, Journal of Behavioral Robotics, 12 (1), 454–467.
Scales, P., Aycard, O., & Aubergé, V.
(2020) Studying navigation as a form of interaction: A design approach for social robot navigation methods. 2020 IEEE International Conference on Robotics and Automation (ICRA), 6965–6972.
Schulz, T., Holthaus, P., Amirabdollahian, F., Koay, K. L., Torresen, J., & Herstad, J.
(2020) Differences of Human Perceptions of a Robot Moving using Linear or Slow in, Slow out Velocity Profiles When Performing a Cleaning Task. 2019 28th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2019.
Sharpe, D.
(2015) Your chi-square test is statistically significant: Now what? Practical Assessment, Research and Evaluation, 20 (1), 1–10.
Shiomi, M., Zanlungo, F., Hayashi, K., & Kanda, T.
(2014) Towards a Socially Acceptable Collision Avoidance for a Mobile Robot Navigating Among Pedestrians Using a Pedestrian Model. International Journal of Social Robotics, 6 (3), 443–455.
Sorrentino, A., Khalid, O., Coviello, L., Cavallo, F., & Fiorini, L.
(2021) Modeling human-like robot personalities as a key to foster socially aware navigation. 2021 30th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2021, 95–101.
Takenaka, H.
(2005) Loss experience and rebirth of elderly people.
Tanaka, K.
(1997) Gerontology Isagoge.
Tennent, H., Moore, D., Jung, M., & Ju, W.
(2017) Good vibrations: How consequential sounds affect perception of robotic arms. RO-MAN 2017 – 26th IEEE International Symposium on Robot and Human Interactive Communication, 2017-January, 928–935.
Torre, I., Linard, A., Steen, A., Tumova, J., & Leite, I.
(2021) Should robots chicken? How anthropomorphism and perceived autonomy influence trajectories in a game-theoretic problem. ACM/IEEE International Conference on Human–Robot Interaction, 370–379.
Tsvetanova, L., Aubergé, V., & Sasa, Y.
(2017) Multimodal breathiness in interaction: From breathy voice quality to global breathy “body behavior quality”. Proceedings of the 1st International Workshop on Vocal Interactivity in-and-between Humans, Animals and Robots – VIHAR 2017.
Watanabe, K., Greenberg, Y., & Sagisaka, Y.
(2014) Sentiment analysis of color attributes derived from vowel sound impression for multimodal expression. Signal and Information Processing Association Annual Summit and Conference (APSIPA), 2014 Asia-Pacific, 1–5.
Zecca, M., Endo, N., Momoki, S., Itoh, K., & Takanishi, A.
(2008) Design of the humanoid robot KOBIAN – preliminary analysis of facial and whole body emotion expression capabilities. 2008 8th IEEE-RAS International Conference on Humanoid Robots, Humanoids 2008, 487–492.
Zhou, A., & Dragan, A. D.
(2018) Cost Functions for Robot Motion Style. IEEE International Conference on Intelligent Robots and Systems, 3632–3639.