Who Needs Emotions? The Brain Meets the Robot


case of CogAff, conjectured as a type of architecture that can explain or replicate human mental phenomena. We show how the concepts that are definable in terms of such architectures can clarify and enrich research on human emotions. If successful for the purposes of science and philosophy, the architecture is also likely to be useful for engineering purposes, though many engineering goals can be achieved using shallow concepts and shallow theories, e.g., producing “believable” agents for computer entertainments. The more human-like robot emotions will emerge, as they do in humans, from the interactions of many mechanisms serving different purposes, not from a particular, dedicated “emotion mechanism.”

Many confusions and ambiguities bedevil discussions of emotion. As a way out of this, we present a view of mental phenomena, in general, and the various sorts of things called “emotions,” in particular, as states and processes in an information-processing architecture. Emotions are a subset of affective states. Since different animals and machines can have different kinds of architecture capable of supporting different varieties of state and process, there will be different families of such concepts, depending on the architecture. For instance, if human infants, cats, or robots lack the sort of architecture presupposed by certain classes of states (e.g., obsessive ambition, being proud of one’s family), then they cannot be in those states. So the question of whether an organism or a robot needs emotions, or needs emotions of a certain type, reduces to the question of what sort of information-processing architecture it has and what needs arise within such an architecture.
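
To make the architecture-relativity of state concepts concrete, here is a minimal sketch in Python. It assumes CogAff-style layers (reactive, deliberative, meta-management, following Sloman’s schema); the particular mapping of states to required layers is our illustrative assumption, not the chapter’s.

```python
# Illustrative sketch: a state concept applies to an agent only if the
# agent's architecture provides every layer the state presupposes.
# Layer names follow Sloman's CogAff schema; the state-to-layer mapping
# below is a simplified assumption for illustration.

REQUIRED_LAYERS = {
    "startle": {"reactive"},
    "anxiety_about_a_plan": {"reactive", "deliberative"},
    "obsessive_ambition": {"reactive", "deliberative", "meta-management"},
}

def state_applies(architecture: set, state: str) -> bool:
    """True if the architecture contains all layers the state presupposes."""
    return REQUIRED_LAYERS[state] <= architecture  # subset test

kitten = {"reactive"}
adult_human = {"reactive", "deliberative", "meta-management"}

print(state_applies(kitten, "obsessive_ambition"))       # False: layers missing
print(state_applies(adult_human, "obsessive_ambition"))  # True
```

On this view, asking whether a cat can be obsessively ambitious is not an empirical question about the cat’s behavior but a question about whether its architecture can support the state at all.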

NEEDS, FUNCTIONS, AND FUNCTIONAL STATES

The general notion of X having a need does not presuppose a notion of goal or purpose but merely refers to necessary conditions for the truth of some statement about X, P(X). In trivial cases, P(X) could be “X continues to exist,” and in less trivial cases, something like “X grows, reproduces, avoids or repairs damage.” All needs are relative to something for which they are necessary conditions. Some needs are indirect insofar as they are necessary for something else that is needed for some condition to hold. A need may also be relative to a context, since Y may be necessary for P(X) only in some contexts. So “X needs Y” is elliptical for something like “There is a context, C, and there is a possible state of affairs, P(X), such that, in C, Y is necessary for P(X).”
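
One way to render that schema formally (our notation, not the chapter’s), reading “Y is necessary for P(X)” as “P(X) cannot hold unless Y holds”:

\[
\text{“}X \text{ needs } Y\text{”} \;\equiv\; \exists C \, \exists P \; \big[\, \text{in } C:\ P(X) \Rightarrow Y \,\big]
\]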
