Who Needs Emotions? The Brain Meets the Robot


problematic. The importance of emotion in intelligent decision making is markedly demonstrated by Damasio’s (1994) studies of patients with neurological damage that impairs their emotional systems. Although these patients perform normally on standardized cognitive tasks, they are severely limited in their ability to make rational and intelligent decisions in their daily lives. For instance, they may lose a great deal of money in an investment. Whereas healthy people would become more cautious and stop investing in it, these emotionally impaired patients do not. They cannot seem to learn the link between bad feelings and dangerous choices, so they keep making the same bad choices again and again. The same pattern is repeated in their relationships and social interactions, resulting in loss of jobs, friends, and more (Damasio, 1994; Picard, 1997). By looking at high-functioning autistics, we can see the crucial role that emotion plays in normal relations with others. They seem to understand the emotions of others like a computer, memorizing and following rules to guide their behavior but lacking an intuitive understanding of others. They are socially handicapped, unable to understand or interpret the social cues of others and to respond in a socially appropriate manner (Baron-Cohen, 1995).

Emotion Theory Applied to Robots

This chapter presents a pragmatic view of the role that emotion-inspired mechanisms and capabilities could play in the design of autonomous robots, especially as applied to human–robot interaction (HRI). Given our discussion above, many examples could be given to illustrate the variety of roles that emotion-inspired mechanisms and abilities could serve for a robot that must make decisions in complex and uncertain circumstances, working either alone or with other robots. Our interest, however, concerns how emotion-inspired mechanisms can improve the way robots function in the human environment and how such mechanisms can improve robots’ ability to work effectively in partnership with people. In general, these two design issues (robust behavior in the real world and effective interaction with humans) are extremely important given that many real-world autonomous robot applications require robots to function as members of human–robot teams.

We illustrate these advantages with a design case study of the cognitive and emotion-inspired systems of our robot, Kismet. We have used the design of natural intelligence as a guide, where Kismet’s cognitive system enables it to figure out what to do and its emotive system helps it to do so more flexibly in complex and uncertain environments (i.e., the human environment), as well as to behave and interact with people in a socially acceptable and natural manner.
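To make this division of labor concrete, the toy sketch below shows one way an affect-like signal could bias behavior selection, echoing Damasio’s somatic-marker observation above: repeated bad outcomes make risky options less attractive, and low arousal favors socially engaging ones. This is a minimal illustration only; the names (EmotiveState, Behavior, select_behavior) and the scoring rule are invented for this example and are not Kismet’s actual architecture.

```python
# Illustrative sketch only: a toy decision loop in which an affect-like state
# biases behavior selection. All names and parameters here are hypothetical.
from dataclasses import dataclass

@dataclass
class EmotiveState:
    valence: float = 0.0   # negative = distress, positive = contentment
    arousal: float = 0.0   # low = calm/bored, high = excited/agitated

    def update(self, outcome_reward: float, stimulation: float) -> None:
        # Blend new appraisals into the current affective state.
        self.valence = 0.8 * self.valence + 0.2 * outcome_reward
        self.arousal = 0.8 * self.arousal + 0.2 * stimulation

@dataclass
class Behavior:
    name: str
    base_utility: float       # task-driven ("cognitive") value of the behavior
    risk: float               # how badly it can go wrong
    social_engagement: float  # how strongly it involves the human partner

def select_behavior(options: list[Behavior], mood: EmotiveState) -> Behavior:
    """Pick a behavior by task utility, modulated by affect:
    negative valence discounts risky options (a crude 'somatic marker'),
    and low arousal adds a bonus for stimulating, social options."""
    def score(b: Behavior) -> float:
        risk_penalty = b.risk * max(0.0, -mood.valence)
        engagement_bonus = b.social_engagement * max(0.0, 0.5 - mood.arousal)
        return b.base_utility - risk_penalty + engagement_bonus
    return max(options, key=score)

# Example: after a string of bad outcomes, the robot shies away from the
# riskier option even though its task utility is nominally higher.
mood = EmotiveState()
for _ in range(5):
    mood.update(outcome_reward=-1.0, stimulation=0.2)

options = [
    Behavior("approach_stranger", base_utility=1.0, risk=0.9, social_engagement=0.8),
    Behavior("stay_with_caregiver", base_utility=0.8, risk=0.1, social_engagement=0.6),
]
print(select_behavior(options, mood).name)  # -> stay_with_caregiver
```

Kismet’s actual implementation is considerably richer; the point of the sketch is only that a small affective state can modulate an otherwise task-driven decision process rather than replace it.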
