Who Needs Emotions? The Brain Meets the Robot


some undesirable emotional states to be hard to avoid in certain contexts if the machines have affective control mechanisms that interact in complex ways.

Detailed studies of design and niche space, in which the relationships between classes of designs and classes of niches for these designs in a variety of environments are investigated, should clarify the costs and benefits. For this, we need experiments with agent architectures that complement theoretical, functional analyses of control systems with systematic studies of performance–cost tradeoffs, which will reveal the utility or disadvantages of various forms of control in various environments.

Finally, the main utility in AI of control systems producing states conforming to our suggested definition of emotional does not lie in systems that need to interact with humans or animals (e.g., by recognizing emotions in others and displaying emotions to others). There is no reason to believe that such control mechanisms (where something can modulate or override the normal behavior of something else) are necessary to achieve “believable interactions” among artifacts and humans. Large sets of condition–action rules, for example, may produce convincing behavioral expressions that give the appearance of sympathy or surprise without implementing the kinds of control mechanism that we called “emotional.” Hence, such systems may appear to be emotional without actually having emotions in our sense, but appearances will suffice for many applications, especially in computer games and entertainments, as they do in human stage performances and in cartoon films.
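The contrast can be made concrete with a small sketch (hypothetical; the rule conditions and expressions are invented for illustration, not taken from the chapter): a flat set of condition–action rules that maps perceived conditions directly to canned behavioral displays, producing the appearance of sympathy or surprise with no affective control mechanism anywhere in the architecture.

```python
# Hypothetical condition-action rules: each rule pairs a condition on the
# current percept with a fixed behavioral expression. Nothing here modulates
# or overrides anything else, so by the chapter's definition the system is
# not "emotional" -- it merely looks that way to an observer.
RULES = [
    (lambda p: p.get("user_state") == "distressed", "lower head, soften voice"),  # reads as sympathy
    (lambda p: p.get("event") == "unexpected",      "raise brows, step back"),    # reads as surprise
    (lambda p: p.get("user_state") == "cheerful",   "nod, brighten display"),     # reads as shared joy
]

def express(percept):
    """Return the expression of the first matching rule, else a neutral default."""
    for condition, expression in RULES:
        if condition(percept):
            return expression
    return "neutral idle posture"

print(express({"event": "unexpected"}))  # raise brows, step back
```

However large such a rule set grows, it remains a lookup from conditions to displays; that is what makes it sufficient for believable interaction yet irrelevant to the definitional question.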

In contrast, control mechanisms capable of producing states conforming to our proposed definition of emotional will be useful in systems that need to cope with dynamically changing, partly unpredictable and unobservable situations where prior knowledge is insufficient to cover all possible outcomes. Specifically, noisy and/or faulty sensors, inexact effectors, and insufficient time to carry out reasoning processes are all limiting factors with which real-world, real-time systems have to deal. As argued in Simon (1967/1979) and Sloman & Croucher (1981), architectures for such systems will require mechanisms able to deal with unexpected situations. In part, this trivializes the claim that emotional controls are useful, since they turn out to be instances of very general requirements that are obvious to engineers who have to design robust and “failsafe” systems to operate in complex environments. What is nontrivial is which systems are useful in different sorts of architectures and why.
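The defining feature named earlier, something that can modulate or override the normal behavior of something else, can be sketched as follows (a minimal, hypothetical illustration; the function names, the 0.5 m threshold, and the world representation are all invented assumptions, not the chapter's design):

```python
# Hypothetical sketch of an "alarm"-style control mechanism: a fast check
# that can override slow, normal deliberation when a critical condition is
# detected, instead of waiting for reasoning to finish. This override
# relationship is what the text's definition of "emotional" control requires.

def deliberate(world):
    """Slow 'normal' behavior: choose the highest-utility option."""
    return max(world["options"], key=lambda o: o["utility"])["action"]

def alarm(world):
    """Fast check; returns an overriding action, or None to let deliberation run."""
    if world.get("obstacle_distance", float("inf")) < 0.5:  # threshold is illustrative
        return "emergency_stop"
    return None

def control_step(world):
    override = alarm(world)          # the alarm can pre-empt normal behavior
    if override is not None:
        return override
    return deliberate(world)

print(control_step({"obstacle_distance": 0.2,
                    "options": [{"action": "advance", "utility": 1.0}]}))
# emergency_stop
```

The point of the sketch is the architecture, not the rule: one subsystem has the authority to interrupt another, which a flat set of condition–action rules does not provide.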

There is much work in computer science and robotics that deals with control systems that have some features in common with what we call affective mechanisms, from real-time operating systems that use timers and alarm mechanisms to achieve time-critical tasks to robot control systems that
