First Suggestions for an Emotion Annotation and ... - OFAI

MAPPING EMOTION REPRESENTATIONS

The reason why EARL foresees the use of different emotion representations is that no preferred representation has yet emerged for all types of use. Instead, the most profitable representation depends on the application. Still, it may be necessary to convert between different emotion representations, e.g. to enable components in a multi-modal generation system to work together even though they use different emotion representations [8].

For that reason, EARL will be complemented with a mechanism for mapping between emotion representations. From a scientific point of view, it will not always be possible to define such mappings. For example, the mapping between categories and dimensions will only work in one direction. Emotion categories, understood as short labels for complex states, can be located on emotion dimensions representing core properties; but a position in emotion dimension space is ambiguous with respect to many of the specific properties of emotion categories, and can thus only be mapped to generic super-categories. Guidelines for defining scientifically meaningful mappings will be provided.
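The one-directional nature of such a mapping can be illustrated with a minimal sketch. The category names, the valence/arousal coordinates, and the super-category labels below are invented for illustration only; they are not taken from the EARL specification.

```python
# Illustrative sketch: each emotion category is located at a point in a
# two-dimensional valence/arousal space. Coordinates are hypothetical.
CATEGORY_POSITIONS = {
    "joy":     (0.8, 0.5),
    "anger":   (-0.6, 0.8),
    "sadness": (-0.7, -0.4),
}

def category_to_dimensions(category):
    """Category -> dimensions: well-defined, each label has a position."""
    return CATEGORY_POSITIONS[category]

def dimensions_to_supercategory(valence, arousal):
    """Dimensions -> category is ambiguous: a point in dimension space
    cannot recover a specific category label, only a generic
    super-category (here, simply the quadrant it falls into)."""
    if valence >= 0:
        return "positive-active" if arousal >= 0 else "positive-passive"
    return "negative-active" if arousal >= 0 else "negative-passive"
```

For example, "joy" maps to the point (0.8, 0.5), but mapping (0.8, 0.5) back yields only the super-category "positive-active", which covers joy as well as other positive, aroused states.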

OUTLOOK

We have presented the expressive power of the EARL specification as it is currently conceived. Some specifications are still suboptimal, such as the representation of start and end times, or the fact that regulation types cannot be associated with a numerical degree (e.g., degree of simulation). Other aspects may be missing but will be required by users, such as the annotation of the object of an emotion or the situational context. The current design choices can be questioned; e.g., more clarity could be gained by replacing the current flat list of attributes for categories, dimensions and appraisals with a substructure of elements. On the other hand, this would increase the annotation overhead, especially for simple annotations, which in practice may be the most frequently used. An iterative procedure of comment and improvement is needed before this language is likely to stabilise into a form suitable for a broad range of applications.

The suggestions outlined in this paper have been elaborated in a detailed specification, currently submitted for comment within HUMAINE. Release of a first public draft is planned for June 2006. We are investigating opportunities for promoting the standardisation of EARL as a recommended representation format for emotional states in technological applications.

ACKNOWLEDGMENTS

We gratefully acknowledge the numerous constructive comments we received from HUMAINE participants. Without them, this work would not have been possible. This research was supported by the EU Network of Excellence HUMAINE (IST 507422) and by the Austrian Funds for Research and Technology Promotion for Industry (FFF 808818/2970 KA/SA). OFAI is supported by the Austrian Federal Ministry for Education, Science and Culture and by the Austrian Federal Ministry for Transport, Innovation and Technology.

This publication reflects only the authors' views. The European Union is not liable for any use that may be made of the information contained herein.

REFERENCES

1. Scherer, K. et al. (2005). Proposal for exemplars and work towards them: Theory of emotions. HUMAINE deliverable D3e, http://emotion-research.net/deliverables

2. Ekman, P. (1999). Basic emotions. In T. Dalgleish & M. J. Power (Eds.), Handbook of Cognition & Emotion (pp. 301–320). New York: John Wiley.

3. Douglas-Cowie, E., Devillers, L., Martin, J.-C., Cowie, R., Savvidou, S., Abrilian, S., & Cox, C. (2005). Multimodal databases of everyday emotion: Facing up to complexity. In Proc. InterSpeech, Lisbon, September 2005.

4. Steidl, S., Levit, M., Batliner, A., Nöth, E., & Niemann, H. (2005). "Of all things the measure is man": Automatic classification of emotions and inter-labeler consistency. In Proc. ICASSP 2005, International Conference on Acoustics, Speech, and Signal Processing, March 19–23, 2005, Philadelphia, USA (pp. 317–320).

5. Scherer, K. R. (2000). Psychological models of emotion. In J. C. Borod (Ed.), The Neuropsychology of Emotion (pp. 137–162). New York: Oxford University Press.

6. Cowie, R., Douglas-Cowie, E., Savvidou, S., McMahon, E., Sawey, M., & Schröder, M. (2000). 'FEELTRACE': An instrument for recording perceived emotion in real time. In Proc. ISCA Workshop on Speech and Emotion, Northern Ireland (pp. 19–24).

7. Ellsworth, P. C., & Scherer, K. (2003). Appraisal processes in emotion. In R. J. Davidson et al. (Eds.), Handbook of Affective Sciences (pp. 572–595). New York: Oxford University Press.

8. Krenn, B., Pirker, H., Grice, M., Piwek, P., van Deemter, K., Schröder, M., Klesen, M., & Gstrein, E. (2002). Generation of multimodal dialogue for net environments. In Proc. Konvens, Saarbrücken, Germany.

9. Aylett, R. S. (2004). Agents and affect: Why embodied agents need affective systems. Invited paper, 3rd Hellenic Conference on AI, Samos, May 2004. Springer Verlag LNAI 3025 (pp. 496–504).

10. de Carolis, B., Pelachaud, C., Poggi, I., & Steedman, M. (2004). APML, a mark-up language for believable behavior generation. In H. Prendinger (Ed.), Life-like Characters. Tools, Affective Functions and Applications. Springer.

11. Kipp, M. (2004). Gesture Generation by Imitation: From Human Behavior to Computer Character Animation. Boca Raton, Florida: Dissertation.com.
