Who Needs Emotions? The Brain Meets the Robot

EDISON: If a particular emotion depends on consciousness, then a roboticist will have to think of what consciousness means for that particular robot. This will force the making of (necessarily simplifying) hypotheses that will go back to neuroscientists and force them to define consciousness. But how useful is a general statement such as “fear includes feelings, and hence consciousness”? Such a statement hides so many exceptions and particulars. Anyway, as a congressman once said, “I do not need to define pornography, I know it when I see it.” Wouldn’t this apply to (human) emotions? I would argue that rather than defining emotion or motivation or feelings, we should instead ask for a clear explanation of what the particular emotion/motivation/feeling is “for” and ask for an operational view.

RUSSELL: All I ask is enough specificity to allow meaningful comparison between different approaches to humans, animals, and machines. Asking what an emotion/motivation/feeling is for is a fine start, but I do not think it will get you far! One still needs to ask, “Do all your examples of emotion include feelings or not?” And if they include feelings, how can you escape discussions of consciousness?

EDISON: Why is this a need? The answer is very likely to be “no,” and then what?

RUSSELL: You say you want to be “operational,” but note that for the animal the operations include measurements of physiological and neurophysiological data, while human data may include not only comparable measurements (GSR, EEG, brain scans, etc.) but also verbal reports. Which of these measurements and reports are essential to the author’s viewpoint? Are biology and the use of language irrelevant to our concerns? If they are relevant (and of course they are!), how do we abstract from these criteria those that make the discussion of emotion/motivation in machines nontrivial?

EDISON: It occurs to me that our difference of view could be essentially technical: I certainly have an engineering approach to the problem of emotion (“just do it, try things out with biology as guidance, generate hypotheses, build the machine and see if/how it works . . .”), while you may have a more theoretical approach (“first crisply define what you mean, and then implement the definition to test/refine it”)?

RUSSELL: I would rather say that I believe in dialectic. A theory rooted in too small a domain may rob us of general insights. Thus, I am not suggesting that we try to find the one true definition of emotion a priori, only that each of us should be clear about what we think we mean or, if you prefer, about the ways in which we use key terms. Then we can move on to shared definitions and refine our thinking in the process. I think that mere tinkering can make the use of terms like emotion or fear vacuous.
