Article published in:
Gaze in Human-Robot Communication
Edited by Frank Broz, Hagen Lehmann, Bilge Mutlu and Yukiko Nakano
[Benjamins Current Topics 81] 2015
pp. 131–158
References

Ando, S.
(2004) Perception of gaze direction based on luminance ratio. Perception, 33, 1173–1184.
Blow, M., Dautenhahn, K., Appleby, A., Nehaniv, C.L., & Lee, D.C.
(2006) Perception of robot smiles and dimensions for human-robot interaction design. Paper presented at the International Symposium on Robot and Human Interactive Communication, Hatfield, Hertfordshire.
Chikaraishi, T., Nakamura, Y., Matsumoto, Y., & Ishiguro, H.
(2008) Gaze control for a natural idling motion in an android. Paper presented at the 13th Robotics Symposia, Kagawa, Japan. (In Japanese).
Delaunay, F., de Greeff, J., & Belpaeme, T.
(2010) A study of a retro-projected robotic face and its effectiveness for gaze reading by humans. Paper presented at the International Conference on Human-Robot Interaction, Osaka, Japan.
Doshi, A., & Trivedi, M.M.
(2009) Head and gaze dynamics in visual attention and context learning. Paper presented at the Computer Society Conference on Computer Vision and Pattern Recognition Workshops, Miami, FL.
Holm, S.
(1979) A simple sequentially rejective multiple test procedure. Scandinavian Journal of Statistics, 6(2), 65–70.
Honda Motor Co., Ltd.
(n.d.) ASIMO – The Honda worldwide ASIMO site. Retrieved from http://world.honda.com/ASIMO/index.html (accessed October 30, 2013).
Jonides, J.
(1981) Voluntary versus automatic control over the mind’s eye’s movement. Attention and Performance, 9, 187–203.
Kendon, A.
(1967) Some functions of gaze direction in social interaction. Acta Psychologica, 26, 22–63.
Kingstone, A., Friesen, C.K., & Gazzaniga, M.S.
(2000) Reflexive joint attention depends on lateralized cortical connections. Psychological Science, 11(2), 159–166.
Kobayashi, H., & Kohshima, S.
(2001) Unique morphology of the human eye and its adaptive meaning: Comparative studies on external morphology of the primate eye. Journal of Human Evolution, 40, 419–435.
Kondo, Y., Kawamura, M., Takemura, K., Takamatsu, J., & Ogasawara, T.
(2011) Gaze motion planning for android robot. Paper presented at the International Conference on Human-Robot Interaction, Lausanne, Switzerland.
Langton, S.R.H., & Bruce, V.
(1999) Reflexive visual orienting in response to the social attention of others. Visual Cognition, 6(5), 541–567.
Levoy, M., & Hanrahan, P.
(1996) Light field rendering. Paper presented at the International Conference and Exhibition on Computer Graphics and Interactive Techniques, New Orleans, LA.
Misawa, K., Ishiguro, Y., & Rekimoto, J.
(2012) Ma petite chérie: What are you looking at?: A small telepresence system to support remote collaborative work for intimate communication. Paper presented at the Augmented Human International Conference, Megève, France.
Mori, M.
(1970) The uncanny valley. Energy, 7(4), 33–35.
Muro, A., & Sato, T.
(2005) Perception of gaze direction with faces generated by computer graphics. Technical Report of the Institute of Electronics, Information and Communication Engineers (IEICE), MVE2005-1, 105(106), 1–6. (In Japanese).
Mutlu, B., Forlizzi, J., & Hodgins, J.
(2006) A storytelling robot: Modeling and evaluation of human-like gaze behavior. Paper presented at the International Conference on Humanoid Robots, Genova, Italy.
Nakagawa, S.
(2012) Robotics design. Tokyo: Bijutsu Shuppan-Sha. (In Japanese).
National Institute of Advanced Industrial Science and Technology
(2010) Successful development of a robot with appearance and performance similar to humans. Retrieved from http://www.aist.go.jp/aist_j/press_release/pr2010/pr20100915/pr20100915.html (accessed October 30, 2013, in Japanese).
NEC Co.
(n.d.) Communication robot PaPeRo: Product | NEC. Retrieved from http://jpn.nec.com/robot/ (accessed October 30, 2013, in Japanese).
Sato, K., Kodama, S., & Azuma, S.
(2005) A report on the effects of virtual character eye expression: Proposal of an interaction model based on psychophysical responses. Technical Report of the Institute of Electronics, Information and Communication Engineers (IEICE), 105(165), 117–122. (In Japanese).
Sidner, C.L., Lee, C., Kidd, C.D., & Rich, C.
(2005) Explorations in engagement for humans and robots. Artificial Intelligence, 166, 140–164.
Sonoyama, T.
(2007) Introduction to robot design. Tokyo: Mainichi Communications. (In Japanese).
Thurstone, L.L.
(1927) Psychophysical analysis. The American Journal of Psychology, 38(3), 368–389.
Tsukida, K., & Gupta, M.R.
(2011) How to analyze paired comparison data (No. UWEETR-2011-0004). Seattle, WA: University of Washington, Department of Electrical Engineering.
Vecera, S.P., & Johnson, M.H.
(1995) Gaze detection and the cortical processing of faces: Evidence from infants and adults. Visual Cognition, 2(1), 59–87.
Vstone Co., Ltd.
(n.d.) PRODUCTS | Vstone Co., Ltd. Retrieved from http://www.vstone.co.jp/english/products.html (accessed October 30, 2013).
Yamazaki, Y., Dong, F., Masuda, Y., Uehara, Y., Kormushev, P., Vu, H.A., Le, P.Q., & Hirota, K.
(2007) Fuzzy inference based mentality estimation for eye robot agent. Paper presented at the 23rd Fuzzy System Symposium, Nagoya, Japan. (In Japanese).