2. www.daimi.au.dk/~chili/elektra.html.
3. One RCX controls the emotional state of the robot on the grounds of tactile stimulation applied to the feet, while the other controls its facial displays.
4. Visit www.daimi.au.dk/~chili/feelix/feelix_home.htm for a video of Feelix's basic expressions.
5. I have also built some demos where Feelix shows chimerical expressions that combine an emotion in the upper part of the face—eyebrows—and a different one in the lower part—mouth.
6. Tests were performed by 86 subjects—41 children, aged 9–10, and 45 adults, aged 15–57. All children and most adults were Danish. Adults were university students and staff unfamiliar with the project, and visitors to the lab.
7. I am grateful to Mark Scheeff for pointing me to this idea, and to Hideki Kozima for helping me track it down. Additional information can be found at www.arclight.net/~pdb/glimpses/valley.html.
8. According to Irenäus Eibl-Eibesfeldt, the baby-scheme is an "innate" response to treat as an infant every object showing certain features present in children. See for example I. Eibl-Eibesfeldt, El hombre preprogramado, Alianza Universidad, Madrid, 1983 (4th edition); original German title: Der vorprogrammierte Mensch, Verlag Fritz Molden, Wien-München-Zürich, 1973.
9. As an example, the speed at which the expression is formed was perceived as particularly significant in sadness and surprise, especially in the motion of the eyebrows.

References

[1] C. Breazeal. Designing Sociable Machines: Lessons Learned. This volume.
[2] C. Breazeal and A. Forrest. Schmoozing with Robots: Exploring the Boundary of the Original Wireless Network. In K. Cox, B. Gorayska, and J. Marsh, editors, Proc. 3rd International Cognitive Technology Conference, pages 375–390. San Francisco, CA, August 11–14, 1999.
[3] L.D. Cañamero and J. Fredslund. I Show You How I Like You—Can You Read It in my Face? IEEE Trans. on Systems, Man, and Cybernetics: Part A, 31(5): 454–459, 2001.
[4] P. Ekman. An Argument for Basic Emotions. Cognition and Emotion, 6(3/4): 169–200, 1992.
[5] P. Ekman. Facial Expressions. In T. Dalgleish and M. Power, editors, Handbook of Cognition and Emotion, pages 301–320. John Wiley & Sons, Sussex, UK, 1999.
[6] P. Ekman and W.V. Friesen. Facial Action Coding System. Consulting Psychology Press, Palo Alto, CA, 1976.
[7] D. Kirsch. The Affective Tigger: A Study on the Construction of an Emotionally Reactive Toy. S.M. thesis, Department of Media Arts and Sciences, Massachusetts Institute of Technology, Cambridge, MA, 1999.
[8] B. Reeves and C. Nass. The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places. Cambridge University Press/CSLI Publications, New York, 1996.
[9] J. Reichard. Robots: Fact, Fiction + Prediction. Thames & Hudson Ltd., London, 1978.
[10] M. Scheeff, J. Pinto, K. Rahardja, S. Snibbe, and R. Tow. Experiences with Sparky, a Social Robot. This volume.
[11] S. Thrun. Spontaneous, Short-term Interaction with Mobile Robots in Public Places. In Proc. IEEE Intl. Conf. on Robotics and Automation. Detroit, Michigan, May 10–15, 1999.
[12] S.S. Tomkins. Affect Theory. In K.R. Scherer and P. Ekman, editors, Approaches to Emotion, pages 163–195. Lawrence Erlbaum, Hillsdale, NJ, 1984.
