Article published in:
Interaction Studies
Vol. 16:2 (2015), pp. 219–248
Admoni, H., Datsikas, C., & Scassellati, B.
(2014) Speech and gaze conflicts in collaborative human-robot interactions. In P. Bello, M. Guarini, M. McShane, & B. Scassellati (Eds.), Proceedings of the 36th Annual Conference of the Cognitive Science Society (CogSci 2014), 104–109.
Bainbridge, W.A., Hart, J.W., Kim, E.S., & Scassellati, B.
(2010) The benefits of interactions with physically present robots over video-displayed agents. International Journal of Social Robotics, 3, 41–52.
Cabibihan, J.J., Javed, H., Ang, M. Jr., & Aljunied, S.M.
(2013) Why robots? A survey on the roles and benefits of social robots in the therapy of children with autism. International Journal of Social Robotics, 5, 593–618.
Baron-Cohen, S.
(1995) Mindblindness: An essay on autism and the theory of mind. Boston: MIT Press/Bradford Books.
Baron-Cohen, S., Wheelwright, S., Hill, J., Raste, Y., & Plumb, I.
(2001a) The “Reading the Mind in the Eyes” Test revised version: A study with normal adults, and adults with Asperger syndrome or high-functioning autism. Journal of Child Psychology and Psychiatry, 42, 241–251.
Baron-Cohen, S., Wheelwright, S., Skinner, R., Martin, J., & Clubley, E.
(2001b) The autism-spectrum quotient (AQ): Evidence from Asperger syndrome/high-functioning autism, males and females, scientists and mathematicians. Journal of Autism and Developmental Disorders, 31, 5–17.
Blake, R., & Shiffrar, M.
(2007) Perception of human motion. Annual Review of Psychology, 58, 47–73.
Blake, R., Turner, L.M., Smoski, M.J., Pozdol, S.L., & Stone, W.L.
(2003) Visual recognition of biological motion is impaired in children with autism. Psychological Science, 14, 151–157.
Chaminade, T., Rosset, D., Da Fonseca, D., Nazarian, B., Lutscher, E., Cheng, G., & Deruelle, C.
(2012) How do we think machines think? Not as intentional agents! An fMRI study of alleged competition with an artificial intelligence. Frontiers in Human Neuroscience, 6, 103.
Cohen, J.
(1988) Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum.
(1992) A power primer. Psychological Bulletin, 112, 155–159.
Cousineau, D.
(2005) Confidence intervals in within-subject designs: A simpler solution to Loftus and Masson’s method. Tutorials in Quantitative Methods for Psychology, 1, 42–45.
Driver, J., Davis, G., Ricciardelli, P., Kidd, P., Maxwell, E., & Baron-Cohen, S.
(1999) Gaze perception triggers reflexive visuospatial orienting. Visual Cognition, 6, 509–540.
DSM-5 (2013) Diagnostic and statistical manual of mental disorders (5th ed.). American Psychiatric Association.
Ehrlich, S., Wykowska, A., Ramirez-Amaro, K., & Cheng, G.
(2014) When to engage in interaction – and how? EEG-based enhancement of robot’s ability to sense social signals in HRI. In IEEE-RAS International Conference on Humanoid Robots, Madrid, Spain.
Farroni, T., Massaccessi, S., Pividori, D., Simion, F., & Johnson, M.H.
(2004) Gaze following in newborns. Infancy, 5.
Faul, F., Erdfelder, E., Lang, A.-G., & Buchner, A.
(2007) G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39, 175–191.
Friesen, C.K., & Kingstone, A.
(1998) The eyes have it! Reflexive orienting is triggered by nonpredictive gaze. Psychonomic Bulletin & Review, 5, 490–495.
Frith, C.D., & Frith, U.
(2008) Implicit and explicit processes in social cognition. Neuron, 60, 503–510.
Gonzalez-Pacheco, V., Malfaz, M., Fernandez, F., & Salichs, M.A.
(2013) Teaching human poses interactively to a social robot. Sensors (Basel), 13, 12406–12430.
Grassmann, S., & Tomasello, M.
(2010) Young children follow pointing over words in interpreting acts of reference. Developmental Science, 13, 252–263.
Gray, H.M., Gray, K., & Wegner, D.
(2007) Dimensions of mind perception. Science, 315.
Grossman, E., & Blake, R.
(2002) Brain areas active during visual perception of biological motion. Neuron, 35, 1157–1165.
Haslam, N., Bain, P., Douge, L., Lee, M., & Bastian, B.
(2005) More human than you: Attributing humanness to self and others. Journal of Personality and Social Psychology, 89, 937–950.
Johansson, G.
(1973) Visual perception of biological motion and a model for its analysis. Perception & Psychophysics, 14, 201–211.
Johnson, M.H.
(2006) Biological motion: A perceptual life detector? Current Biology, 16, R376–R377.
Jung, Y., & Lee, K.M.
(2004) Effects of physical embodiment on social presence of social robots. Proceedings of PRESENCE, 80–87.
Kagan, J.
(2004) The uniquely human in human nature. Daedalus, 133, 77–88.
Kozima, H., Nakagawa, C., & Yasuda, Y.
(2005) Interactive robots for communication-care: A case-study in autism therapy. IEEE International Workshop on Robots and Human Interactive Communications (RO-MAN), 341–346.
Kuhlmeier, V.A., Troje, N.F., & Lee, V.
(2010) Young infants detect the direction of biological motion in point-light displays. Infancy, 15, 83–93.
Lee, K.M., Jung, Y., Kim, J., & Kim, S.R.
(2006) Are physically embodied social agents better than disembodied social agents? The effects of physical embodiment, tactile interaction, and people’s loneliness in human-robot interaction. International Journal of Human-Computer Studies, 64, 962–973.
Li, H., Cabibihan, J.J., & Tan, Y.K.
(2011) Towards an effective design of social robots. International Journal of Social Robotics, 3, 333–335.
MacDorman, K.F., & Ishiguro, H.
Metta, G., Sandini, G., Vernon, D., Natale, L., & Nori, F.
(2008) The iCub humanoid robot: An open platform for research in embodied cognition. Proceedings of the 8th Workshop on Performance Metrics for Intelligent Systems, Gaithersburg, Maryland, 50–56.
Moll, H., Carpenter, M., & Tomasello, M.
(2007) Fourteen-month-olds know what others experience only in joint engagement. Developmental Science, 10, 826–835.
Mori, M.
(1970) The uncanny valley. Energy, 7, 33–35.
Mutlu, B., Forlizzi, J., & Hodgins, J.
(2006) A storytelling robot: Modeling and evaluation of human-like gaze behavior. IEEE-RAS International Conference on Humanoid Robots, 518–523.
Mutlu, B., Yamaoka, F., Kanda, T., Ishiguro, H., & Hagita, N.
(2009) Nonverbal leakage in robots: Communication of intentions through seemingly unintentional behavior. Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI ’09). ACM, New York, NY, USA, 69–76.
Pfeiffer, U.J., Timmermans, B., Bente, G., Vogeley, K., & Schilbach, L.
(2011) A non-verbal Turing test: Differentiating mind from machine in gaze-based social interaction. PLoS One, 6(11), e27591.
Scassellati, B., Admoni, H., & Matarić, M.
(2012) Robots for use in autism research. Annual Review of Biomedical Engineering, 14, 275–294.
Schilbach, L., Timmermans, B., Reddy, V., Costall, A., Bente, G., Schlicht, T., & Vogeley, K.
(2013) Toward a second-person neuroscience. Behavioral and Brain Sciences, 36, 393–414.
Schuwerk, T., Vuori, M., & Sodian, B.
(2014) Implicit and explicit Theory of Mind reasoning in autism spectrum disorders: The impact of experience. Autism.
Senju, A., & Johnson, M.H.
(2009) The eye contact effect: Mechanisms and development. Trends in Cognitive Sciences, 13, 127–134.
Soni, B., & Hingston, P.
(2008) Bots trained to play like a human are more fun. In 2008 IEEE International Joint Conference on Neural Networks (IJCNN 2008), IEEE World Congress on Computational Intelligence, 363–369.
Takahashi, H., et al.
(2014) Different impressions of other agents obtained through social interaction uniquely modulate dorsal and ventral pathway activities in the social human brain. Cortex, 58, 289–300.
Thornton, I.M., & Vuong, Q.C.
(2004) Incidental processing of biological motion. Current Biology, 14, 1084–1089.
Turing, A.
(1950) Computing machinery and intelligence. Mind, 59, 433–460.
Wang, Y., & Hamilton, A.F.
(2014) Why does gaze enhance mimicry? Placing gaze-mimicry effects in relation to other gaze phenomena. Quarterly Journal of Experimental Psychology, 67, 747–762.
Wiese, E., Wykowska, A., & Müller, H.J.
(2014) What we observe is biased by what other people tell us: Beliefs about the reliability of gaze behavior modulate attentional orienting to gaze cues. PLoS One, 9(4), e94529.
Wiese, E., Wykowska, A., Zwickel, J., & Müller, H.J.
(2012) I see what you mean: How attentional selection is shaped by ascribing intentions to others. PLoS One, 7(9), e45391.
Wimmer, H., & Perner, J.
(1983) Beliefs about beliefs: Representation and constraining function of wrong beliefs in young children’s understanding of deception. Cognition, 13, 103–128.
Wykowska, A., Wiese, E., Prosser, A., & Müller, H.J.
(2014) Beliefs about the minds of others influence how we process sensory information. PLoS One, 9(4).
Wykowska, A., Kajopoulos, J., Obando-Leitón, M., Cabibihan, J., Chauhan, S., & Cheng, G.
(2015) Humans are well tuned to detecting agents among non-agents. International Journal of Social Robotics.
Cited by 18 other publications

Chevalier, Pauline, Kyveli Kompatsiari, Francesca Ciardo & Agnieszka Wykowska
2020. Examining joint attention with the use of humanoid robots – A new approach to study fundamental mechanisms of social cognition. Psychonomic Bulletin & Review 27:2, pp. 217 ff.
Ciardo, F., D. De Tommaso & A. Wykowska
2022. Human-like behavioral variability blurs the distinction between a human and a machine in a nonverbal Turing test. Science Robotics 7:68
El-Muhammady, Muhammad Faisal, Sarah Afiqah Mohd Zabidi, Hazlina Md. Yusof, Mohammad Ariff Rashidan, Shahrul Na’im Sidek & Aimi Shazwani Ghazali
2023. Initial Response in HRI: A Pilot Study on Autism Spectrum Disorder Children Interacting with a Humanoid QTrobot. In Robot Intelligence Technology and Applications 7 [Lecture Notes in Networks and Systems, 642], pp. 393 ff.
Grynszpan, Ouriel, Jacqueline Nadel, Jean-Claude Martin & Philippe Fossati
2017. The awareness of joint attention. Interaction Studies. Social Behaviour and Communication in Biological and Artificial Systems 18:2, pp. 234 ff.
Kompatsiari, K., J. Perez-Osorio, D. De Tommaso, G. Metta & A. Wykowska
2018. 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 3403 ff.
Kompatsiari, Kyveli, Francesco Bossi & Agnieszka Wykowska
2021. Eye contact during joint attention with a humanoid robot modulates oscillatory brain activity. Social Cognitive and Affective Neuroscience 16:4, pp. 383 ff.
Kompatsiari, Kyveli, Francesca Ciardo, Vadim Tikhanoff, Giorgio Metta & Agnieszka Wykowska
2018. On the role of eye contact in gaze cueing. Scientific Reports 8:1
2021. It’s in the Eyes: The Engaging Role of Eye Contact in HRI. International Journal of Social Robotics 13:3, pp. 525 ff.
Kompatsiari, Kyveli, Vadim Tikhanoff, Francesca Ciardo, Giorgio Metta & Agnieszka Wykowska
2017. The Importance of Mutual Gaze in Human-Robot Interaction. In Social Robotics [Lecture Notes in Computer Science, 10652], pp. 443 ff.
Marchesi, Serena, Davide Ghiglino, Francesca Ciardo, Jairo Perez-Osorio, Ebru Baykara & Agnieszka Wykowska
2019. Do We Adopt the Intentional Stance Toward Humanoid Robots? Frontiers in Psychology 10
Morgan, Emma J., Daniel T. Smith & Megan Freeth
2023. Gaze cueing, mental states, and the effect of autistic traits. Attention, Perception, & Psychophysics 85:2, pp. 485 ff.
Natale, Lorenzo, Chiara Bartolozzi, Daniele Pucci, Agnieszka Wykowska & Giorgio Metta
2017. iCub: The not-yet-finished story of building a robot child. Science Robotics 2:13
Perez-Osorio, J., D. De Tommaso, E. Baykara & A. Wykowska
2018. 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), pp. 152 ff.
Perez-Osorio, Jairo & Agnieszka Wykowska
2019. Adopting the Intentional Stance Towards Humanoid Robots. In Wording Robotics [Springer Tracts in Advanced Robotics, 130], pp. 119 ff.
Schellen, Elef, Francesco Bossi & Agnieszka Wykowska
2021. Robot Gaze Behavior Affects Honesty in Human-Robot Interaction. Frontiers in Artificial Intelligence 4
Willemse, Cesco, Abdulaziz Abubshait & Agnieszka Wykowska
2022. Motor behaviour mimics the gaze response in establishing joint attention, but is moderated by individual differences in adopting the intentional stance towards a robot avatar. Visual Cognition 30:1-2, pp. 42 ff.
Willemse, Cesco, Serena Marchesi & Agnieszka Wykowska
2018. Robot Faces that Follow Gaze Facilitate Attentional Engagement and Increase Their Likeability. Frontiers in Psychology 9
Wykowska, Agnieszka, Thierry Chaminade & Gordon Cheng
2016. Embodied artificial agents for understanding human social cognition. Philosophical Transactions of the Royal Society B: Biological Sciences 371:1693, pp. 20150375 ff.

This list is based on CrossRef data as of 21 March 2023. Please note that it may not be complete. Sources presented here have been supplied by the respective publishers. Any errors therein should be reported to them.