Who Needs Emotions? The Brain Meets the Robot


The majority view in this volume seems to be that we need explanatory theories that include theoretical entities whose properties may not be directly detectable, at least using the methods of the physical sciences or the measurements familiar to psychologists (including button-pushing events, timings, questionnaire results, etc.). This is consistent with the generic definition of emotion proposed in this chapter, based on internal processes that are capable of modulating other processes (i.e., initiating or interrupting them, changing parameters that give rise to dispositional changes, etc.). Such a definition should be useful both for psychologists interested in the study of human emotions and for engineers implementing deep emotional control systems for robots or virtual agents. While the definition was not intended to cover all aspects of the ordinary use of the word emotion (nor could it cover them all, given that "emotion" is a cluster concept), it can be used as a guideline that determines the minimal set of architectural features necessary to implement emotions (as defined in this paper). Furthermore, it allows us to determine whether a given architecture is capable of implementing such emotions and, if so, of what kinds (as different emotion terms are defined using architectural features). This is different from much research in AI, where it is merely taken as obvious that a system of a certain sort is indeed emotional.

More importantly, our definition also suggests possible roles for the mechanisms that generate what are described as "emotions" in agent architectures (e.g., as interrupt controllers, process modifiers, action initiators or suppressors, etc.) and, hence, when and where it is appropriate and useful to employ such control systems. This is crucial for a general understanding of the utility of what is often referred to as "emotional control" and, consequently, the adaptive advantage of the underlying mechanisms in biological systems, even though many of the emotions they produce may be dysfunctional.
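To make the architectural idea concrete, the following minimal sketch (in Python, with entirely hypothetical names such as AlarmMechanism and Process) illustrates one way an emotion-like mechanism could modulate other processes by interrupting them and re-weighting their dispositions. It is only an illustration of the roles listed above, not an implementation of any particular architecture discussed in this volume.

```python
# Hypothetical sketch: an emotion-like "alarm" process that modulates other
# processes by suspending some and changing the control parameters of others.

class Process:
    """A task the agent runs when it is scheduled."""
    def __init__(self, name, priority=1.0):
        self.name = name
        self.priority = priority
        self.suspended = False

    def step(self):
        if not self.suspended:
            print(f"{self.name} running (priority {self.priority:.1f})")


class AlarmMechanism:
    """Emotion-like controller: monitors a signal and modulates processes."""
    def __init__(self, threshold=0.7):
        self.threshold = threshold

    def modulate(self, threat_level, processes):
        if threat_level > self.threshold:
            for p in processes:
                if p.name == "deliberate_planning":
                    p.suspended = True       # interrupt a slow process
                if p.name == "escape_behavior":
                    p.priority = 10.0        # change a dispositional parameter
        else:
            for p in processes:
                p.suspended = False
                p.priority = 1.0


if __name__ == "__main__":
    processes = [Process("deliberate_planning"), Process("escape_behavior")]
    alarm = AlarmMechanism()
    for threat in (0.1, 0.9):                # calm input, then alarming input
        alarm.modulate(threat, processes)
        for p in processes:
            p.step()
```

Running the sketch shows both processes executing under the calm input, and only the re-prioritized escape behavior executing once the alarm threshold is crossed, which is the sense of "modulation" used in the definition above.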

Do Robots Need Emotions and Why?

One of the questions some robot designers address is whether there is any principled reason why their robots need emotions to perform a given task (assuming some clear definition of emotion). However, there is a more general question: whether there is any task that cannot be performed by a system that is not capable of having emotional states.

The answer to this question is certainly nontrivial in the general case. For simple control systems satisfying a particular definition of emotional, it may be possible to define a finite-state machine that has exactly the same input–output behavior but does not instantiate any emotion in the specified sense. Most so-called emotional agents currently developed in AI would probably fall under this category.
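As a hedged illustration of this point, the toy Python sketch below (all names hypothetical) pairs a controller that uses an internal, emotion-like "fear" variable with a bare two-state finite-state machine defined only by a transition table. The two produce identical input–output behavior, even though only the first contains anything one might be tempted to label an emotional state.

```python
# Toy comparison: an "emotional" controller versus an equivalent finite-state
# machine with the same input-output behavior. All names are hypothetical.

def emotional_controller(percepts):
    """Chooses actions via an internal, emotion-like 'fear' state."""
    fear = False
    actions = []
    for p in percepts:
        if p == "threat":
            fear = True           # emotion-like state change
        elif p == "safe":
            fear = False
        actions.append("flee" if fear else "explore")
    return actions


def fsm_controller(percepts):
    """Same behavior as a plain two-state transition table, no 'emotion'."""
    transition = {("S0", "threat"): "S1", ("S0", "safe"): "S0", ("S0", "none"): "S0",
                  ("S1", "threat"): "S1", ("S1", "safe"): "S0", ("S1", "none"): "S1"}
    output = {"S0": "explore", "S1": "flee"}
    state, actions = "S0", []
    for p in percepts:
        state = transition[(state, p)]
        actions.append(output[state])
    return actions


# Identical observable behavior on any sequence of these percepts.
assert emotional_controller(["safe", "threat", "none", "safe"]) == \
       fsm_controller(["safe", "threat", "none", "safe"])
```

Whether the first controller "really" has an emotion, of course, depends on the architecture-based definition being applied, which is exactly the point at issue.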
