‘Smart’ devices are becoming increasingly ubiquitous. While these sophisticated machines are useful for many purposes, they can also evoke the feelings of eeriness or discomfort known as uncanniness, a much-discussed phenomenon in robotics research. Adult
participants (N = 115) rated the uncanniness of a hypothetical future smart speaker that was described as possessing the
mental capacities for experience, agency, neither, or both. The novel condition prompting participants to attribute both agency and
experience to the speaker filled an important theoretical gap in the literature. Consistent with the mind perception hypothesis of
uncanniness (MPH; Gray & Wegner, 2012), participants in the with-experience condition rated
the device significantly higher in uncanniness than those in the control condition and the with-agency condition. Participants in the
with-both (experience and agency) condition also rated the device higher in uncanniness than those in the control condition and the
with-agency condition, although this latter difference only approached statistical significance.
SM1. Paragraph describing smart speakers and listing their current features; accompanying images of currently popular smart speakers (presented to all participants)
SM2. Items presented to participants (dependent variables)
Appel, M., Izydorczyk, D., Weber, S., Mara, M., & Lischetzke, T. (2020). The uncanny of mind in a machine: Humanoid robots as tools, agents, and experiencers. Computers in Human Behavior, 102, 274–286.
Apple. (2018, January 23). HomePod arrives February 9, available to order this Friday [Press release]. Retrieved from [URL]
Brink, K. A., Gray, K., & Wellman, H. M. (2019). Creepiness creeps in: Uncanny valley feelings are acquired in childhood. Child Development, 90, 1202–1214.
Broadbent, E., Kumar, V., Li, X., Sollers, J., III, Stafford, R. Q., MacDonald, B. A., & Wegner, D. M. (2013). Robots with display screens: A robot with a more humanlike face display is perceived to have more mind and a better personality. PLoS ONE, 8, e72589.
Broadbent, E., Kuo, I. H., Lee, Y. I., Rabindran, J., Kerse, N., Stafford, R., & MacDonald, B. A. (2010). Attitudes and reactions to a healthcare robot. Telemedicine and e-Health, 16, 608–613.
Ciechanowski, L., Przegalinska, A., Magnuski, M., & Gloor, P. (2019). In the shades of the uncanny valley: An experimental study of human–chatbot interaction. Future Generation Computer Systems, 92, 539–548.
Crandall, C. S., & Sherman, J. W. (2016). On the scientific superiority of conceptual replications for scientific progress. Journal of Experimental Social Psychology, 66, 93–99.
Creed, C., & Beale, R. (2012). User interactions with an affective nutritional coach. Interacting with Computers, 24, 339–350.
Creed, C., Beale, R., & Cowan, B. (2015). The impact of an embodied agent’s emotional expressions over multiple interactions. Interacting with Computers, 27, 172–188.
Deng, E., Mutlu, B., & Mataric, M. J. (2019). Embodiment in socially interactive robots. Foundations and Trends in Robotics, 7, 251–356.
Ferrey, A. E., Burleigh, T. J., & Fenske, M. J. (2015). Stimulus-category competition, inhibition, and affective devaluation: A novel account of the uncanny valley. Frontiers in Psychology, 6, 249.
Gray, H. M., Gray, K., & Wegner, D. M. (2007). Dimensions of mind perception. Science, 315, 619.
Gray, K., Jenkins, A. C., Heberlein, A. S., & Wegner, D. M. (2011). Distortions of mind perception in psychopathology. Proceedings of the National Academy of Sciences, 108, 477–479.
Gray, K., & Wegner, D. M. (2012). Feeling robots and human zombies: Mind perception and the uncanny valley. Cognition, 125, 125–130.
Gray, K., Young, L., & Waytz, A. (2012). Mind perception is the essence of morality. Psychological Inquiry, 23, 101–124.
Ho, C. C., MacDorman, K. F., & Pramono, Z. D. (2008, March). Human emotion and the uncanny valley: A GLM, MDS, and Isomap analysis of robot video ratings. In Proceedings of the 3rd ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 169–176). Amsterdam, the Netherlands.
Kätsyri, J., Förger, K., Mäkäräinen, M., & Takala, T. (2015). A review of empirical evidence on different uncanny valley hypotheses: Support for perceptual mismatch as one road to the valley of eeriness. Frontiers in Psychology, 6, 390.
Kawabe, T., Sasaki, K., Ihaya, K., & Yamada, Y. (2017). When categorization-based stranger avoidance explains the uncanny valley: A comment on MacDorman and Chattopadhyay (2016). Cognition, 161, 129–131.
Knobe, J., & Prinz, J. (2008). Intuitions about consciousness: Experimental studies. Phenomenology and the Cognitive Sciences, 7, 67–83.
Kupferberg, A., Glasauer, S., Huber, M., Rickert, M., Knoll, A., & Brandt, T. (2011). Biological movement increases acceptance of humanoid robots as human partners in motor interaction. AI & Society, 26, 339–345.
Lau, J., Zimmerman, B., & Schaub, F. (2018). Alexa, are you listening? Privacy perceptions, concerns and privacy-seeking behaviors with smart speakers. Proceedings of the ACM on Human-Computer Interaction, 2(CSCW), 102.
Liu, B., & Sundar, S. S. (2018). Should machines express sympathy and empathy? Experiments with a health advice chatbot. Cyberpsychology, Behavior, and Social Networking, 21, 625–636.
Lynch, J. G. Jr., Bradlow, E. T., Huber, J. C., & Lehmann, D. R. (2015). Reflections on the replication corner: In praise of conceptual replications. International Journal of Research in Marketing, 32, 333–342.
MacDorman, K. F., & Chattopadhyay, D. (2016). Reducing consistency in human realism increases the uncanny valley effect; increasing category uncertainty does not. Cognition, 146, 190–205.
MacDorman, K. F., Green, R. D., Ho, C. C., & Koch, C. T. (2009). Too real for comfort? Uncanny responses to computer generated faces. Computers in Human Behavior, 25, 695–710.
MacDorman, K. F., Vasudevan, S. K., & Ho, C. C. (2009). Does Japan really have robot mania? Comparing attitudes by implicit and explicit measures. AI & Society, 23, 485–510.
Mitchell, W. J., Szerszen, K. A. Sr., Lu, A. S., Schermerhorn, P. W., Scheutz, M., & MacDorman, K. F. (2011). A mismatch in the human realism of face and voice produces an uncanny valley. i-Perception, 2, 10–12.
Moore, R. K. (2012). A Bayesian explanation of the ‘Uncanny Valley’ effect and related psychological phenomena. Scientific Reports, 2, 864.
Mori, M. (1970/2005). The uncanny valley (K. F. MacDorman & T. Minato, Trans.). Energy, 7, 33–35.
Mori, M., MacDorman, K. F., & Kageki, N. (2012). The uncanny valley [from the field]. IEEE Robotics & Automation Magazine, 19, 98–100.
Myers, C. M., Furqan, A., & Zhu, J. (2019, May). The impact of user characteristics and preferences on performance with an unfamiliar voice user interface. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1–9). Glasgow, Scotland.
NPR & Edison Research. (2018). The smart audio report, winter 2018. Retrieved from [URL]
Pollick, F. E. (2010). In search of the uncanny valley. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, 40, 69–78.
R Core Team (2020). R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing. [URL]
Ramey, C. H. (2006). An inventory of reported characteristics for home computers, robots, and human beings: Applications for android science and the uncanny valley. In MacDorman, K. F., & Ishiguro, H. (Eds.), Proceedings of the ICCS/CogSci 2006 Long Symposium: ‘Toward Social Mechanisms of Android Science’ (pp. 21–25). Vancouver, Canada.
Seyama, J. I., & Nagayama, R. S. (2007). The uncanny valley: Effect of realism on the impression of artificial human faces. Presence: Teleoperators and Virtual Environments, 16, 337–351.
Stafford, R. Q., Broadbent, E., Jayawardena, C., Unger, U., Kuo, I. H., Igic, A., Wong, R., Kerse, N., Watson, C., & MacDonald, B. A. (2010, September). Improved robot attitudes and emotions at a retirement home after meeting a robot. In Proceedings of the 19th International Symposium in Robot and Human Interactive Communication (pp. 82–87). Viareggio, Italy.
Stafford, R. Q., MacDonald, B. A., Jayawardena, C., Wegner, D. M., & Broadbent, E. (2014). Does the robot have a mind? Mind perception and attitudes towards robots predict use of an eldercare robot. International Journal of Social Robotics, 6, 17–32.
Stein, J. P., & Ohler, P. (2017). Venturing into the uncanny valley of mind – The influence of mind attribution on the acceptance of human-like characters in a virtual reality setting. Cognition, 160, 43–50.
Tharp, M., Holtzman, N. S., & Eadeh, F. R. (2017). Mind perception and individual differences: A replication and extension. Basic and Applied Social Psychology, 39, 68–73.
Waddell, K. (2017, April 21). Chatbots have entered the uncanny valley. The Atlantic. Retrieved from [URL]
Wang, S., Lilienfeld, S. O., & Rochat, P. (2015). The uncanny valley: Existence and explanations. Review of General Psychology, 19, 393–407.
Wang, X., & Krumhuber, E. G. (2018). Mind perception of robots varies with their economic versus social function. Frontiers in Psychology, 9, 1230.
Waytz, A., Cacioppo, J., & Epley, N. (2010). Who sees human? The stability and importance of individual differences in anthropomorphism. Perspectives on Psychological Science, 5, 219–232.
Wickham, H. (2016). ggplot2: Elegant graphics for data analysis. New York: Springer-Verlag.
Wilson, M. (2018, March 9). Alexa’s creepy laughter is a bigger problem than Amazon admits. Fast Company. Retrieved from [URL]