
CHI 2006 Proceedings • Beliefs and Affect
April 22-27, 2006 • Montréal, Québec, Canada

The Sensual Evaluation Instrument: Developing an Affective Evaluation Tool

Katherine Isbister¹, Kia Höök², Michael Sharp¹, Jarmo Laaksolahti²

¹ Rensselaer Polytechnic Institute, 110 8th Street, Sage 4208, Troy, New York 12180 U.S.A., {isbisk,sharpd}@rpi.edu, +1 518 276 8262
² SICS, Box 1263, S-164 28 Kista, Sweden, {kia,jarmo}@sics.se

ABSTRACT

In this paper we describe the development and initial testing of a tool for self-assessment of affect while interacting with computer systems: the Sensual Evaluation Instrument. We discuss our research approach within the context of existing affective and HCI theory, and describe stages of evolution of the tool and initial testing of its effectiveness.

Author Keywords
Affective interaction, user-centered design, ambiguity, gesture-based interaction, embodiment.

ACM Classification Keywords
H.5.2 User interfaces

INTRODUCTION

Practitioners in the CHI community have become increasingly convinced of the importance of affect—both in terms of designing experiences for users which take it into account [18, 16], and also in terms of developing measures for user satisfaction with such systems [10].

There are various modes of approach to measuring affect. Traditionally, affect has been ascertained in two primary ways: using questionnaires administered after an experience, which ask the user to rate his/her feelings about what occurred, and analysis of videotaped sessions with users that typically combine interpretation of think-aloud commentary with deciphering of other cues of emotion (smiling, gestures and the like) to develop an impression of the user's affective reactions. In recent years, additional tools based upon biometrics have evolved—measuring galvanic skin response, detecting small movements of the muscles of the face, tracking pressure on the mouse [19]. These signals provide a complex trace of the user's affect which many in our community (and in the psychological and physiological communities) are still working to understand adequately.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. CHI 2006, April 22-27, 2006, Montréal, Québec, Canada. Copyright 2006 ACM 1-59593-178-3/06/0004...$5.00.

Our own efforts in developing an affective assessment tool have been shaped by our past work in understanding and enhancing nonverbal communication, our commitment to user-centered design, and our interest in trans-cultural methods to support global design practice. Our research approach arises from several baseline principles: 1) a belief that freeing the person giving us affective feedback from words, and incorporating the use of their body to give feedback, will be beneficial; 2) the belief that self-report is a valuable practice, as part of a user-centered design approach; 3) an interest in creating a flexible and portable tool that can be widely used, that could transcend language and culture; 4) the desire to create an expressive, open-ended tool that does not remove all ambiguity from communication by the user but rather offers a range of expressive possibilities.

Our development process was a user-centered one, conducted in several stages, with input along the way: an initial brainstorming stage; early, open-ended concept testing; refinement of the research instrument; and open-ended testing of the prototype tool.

CONTEXT

This project emerged from a European Network of Excellence initiative called Humaine (http://emotion-research.net/), focused on the study of affective systems. Our work group in the project is dedicated to exploring ways and means for evaluating a system's affective impact on users. As we gathered sources and learned about existing methods, we found that there was still much work to do in refining methods for eliciting and understanding user affect. In particular, there seemed to be a gap in terms of easy-to-use self-report measures. System designers were either relying on questionnaires given after the interaction was complete, laboriously coding data from videotape of sessions, or using biometric equipment that requires relatively complex analysis in order to deliver reliable affective data. Was it possible to create a lightweight self-report measure that could give reasonable in-line feedback


from the user as the experience was unfolding? We felt this was worth exploring, especially insofar as it could enhance the design process by enabling more rapid test-and-iterate cycles based upon affective responses to a system in development.

It is important to reiterate here that we wanted this tool to allow flexibility for users to adapt it to their own style of expressing affect, toward creating a dialog between designers and users that enhances design. While we did aim for some level of consistency in response to the instrument's qualities, we were not solely focused on consistent use patterns, nor would we claim that this is a highly accurate tool for measuring affect. In this regard the aims of the project differ radically from other self-report measures, and our instrument should be viewed as complementary to, rather than an alternative to, other measures.

In Support of a Body-based, Nonverbal Approach

Neurophysiologists and psychologists have in recent years proposed that our brains, rather than operating in a wholly logical, conscious, verbal manner, actually process information and make decisions using various layers working in parallel, complementary ways. They have demonstrated, for example, that we can learn something new and 'intuitively' put it into action before we are able to consciously verbalize it [15]. Affective processing, in particular, has been shown to occur at levels other than the cognitive/word-oriented level of the brain (e.g. the primal nature of fear; see [14]). Emotions are experienced by both body and mind. Often, they are evoked by sub-symbolic stimuli, such as colors, shapes, gestures, or music.

If we rely on the verbal channel for self-report of affect during interaction with a system, we may be missing out on much of this information, as it is filtered through the person's verbal system. The authors [6, 11] have explored the role of nonverbal expression in interfaces in the past, and have found the body to be an important channel for communicating affect and other qualities. We decided that our affective instrument would be sensual and physical in nature, to allow us to access this part of a person's experience more directly. In addition, we hoped that avoiding verbalization would make it more likely that users could give in-line feedback without their task being as disrupted as if they had to explicitly verbalize their current affect.

User-centered Design Values

Höök in particular has had a long-standing commitment to user-centered design practice, and to finding ways to bring users into the design process earlier in development cycles to strengthen final design outcomes. To the extent that a usability practice is intuitive and simple to operate and analyze, it becomes more likely to be adopted for use during development, especially during early stages when significant change is still possible. Höök's experience has been that tools which have a hope of being used well in early design practice need to be flexible—allowing for variance in users' approaches and mental models of what they are doing, and ranges of modes of expression. A modicum of ambiguity in the tool at this stage is good rather than bad, as it allows users to co-define the meaning of their feedback with the designer, to reach a richer shared picture of what is going on [7]. This is especially appropriate for a tool to measure affect—models of emotion from the realm of psychology are not always fully descriptive of (or useful in the context of) measuring affective reaction to a system. Some emotions rarely come into play when engaging with computer systems, and the ebb and flow of reaction to a system is not necessarily the same as that which occurs in face-to-face interaction among people. Part of the hidden work of this project, then, was learning about which emotions are truly relevant in the context of evolving successful interaction with computer systems, and offering ways to express those emotions with our research instrument.

Practical Considerations: A Portable, Flexible, Cross-culturally Valid Instrument

By avoiding verbalization, we also realized that we had a chance of developing a less culturally dependent research instrument. Questionnaires must normally be translated and counter-translated to ensure that they are truly measuring the target emotion, even though there is strong evidence for trans-cultural similarities in basic emotional response. Perhaps a nonverbal, body-based approach could tap more directly into these shared responses, saving designers time and energy, and again, leading to a greater likelihood of early user participation in design. We were inspired by the work of Liz Sanders [21], Bill Gaver [7, 8, 4], and others, in conducting inquiries that move freely beyond the lab and into various cultural settings. Thus from the beginning, one end goal of our project was the creation of a 'kit' that could be easily shipped to others around the world for use.

And Finally, in Support of Fun

We subscribe to the notion of 'Functional Aesthetics' [4], and would like to extend this notion to the design of methodologies as well as end products. Particularly in the study of affect, we feel the experience of giving feedback should be pleasant in and of itself.

APPROACH

We developed the prototype of the Sensual Evaluation Instrument after some initial exploratory study of the possibility space.

Stage One: Exploratory Design and Iteration

We began by exploring work-to-date on nonverbal self-report measures. There was some history of the standardization and use of nonverbal scales in psychology (e.g. PONS, the Profile of Nonverbal Sensitivity [3]), from which we could draw lessons.



Most work done on nonverbal systems of evaluation has involved anthropomorphic imagery (likenesses of human faces and/or bodies). For example, researchers who work with children have established the value and reliability of face-based Likert scales for determining emotional responses to systems and situations (e.g. Wong and Baker's work on children's subjective evaluation of pain; see Figure 1) [24].

Figure 1. A pain scale used to help children indicate their level of discomfort.
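The idea behind a face-based Likert scale can be sketched in a few lines: a continuous self-rating is snapped to the nearest of a small set of face anchors. The sketch below is a hypothetical illustration of that mechanism, not the official scoring of any published scale; the six anchor values are an assumption for the example.

```python
# Hypothetical sketch of a face-based Likert scale: a 0-10 self-rating is
# snapped to the nearest of six evenly spaced face anchors (an assumed
# layout, not a published scoring rule).

FACES = [0, 2, 4, 6, 8, 10]  # face anchors, from "no discomfort" to "worst"

def nearest_face(rating: float) -> int:
    """Return the face anchor closest to a 0-10 self-rating."""
    return min(FACES, key=lambda anchor: abs(anchor - rating))

print(nearest_face(3.4))  # snaps to 4
print(nearest_face(7.0))  # snaps to 6 (ties resolve to the lower anchor)
```

Snapping to a coarse set of anchors is what makes such scales quick to administer and easy for children to use, at the cost of fine-grained resolution.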

There are also popular uses of nonverbal affective scales (thumbs-up and thumbs-down movie ratings, and the 'little man' on the San Francisco Chronicle movie review page, http://www.sfgate.com/eguide/movies/reviews/; see Figure 2), indicating that calibration and use of nonverbal scales is possible and appealing in everyday contexts.

Figure 2. The San Francisco Chronicle's movie review system uses facial expression and body posture to indicate a movie's quality.

Finally, there has been some work in the product design community on mapping product qualities to affective reactions, for example the facial wheel used by Wensveen and colleagues in developing an affectively appropriate alarm clock (Figure 3; see [22, 23]).

Our wish in this project was to move away from discrete, pre-defined emotional labels such as faces, figures, or names, and move toward some form of nonverbal code that allowed more open-ended interpretation but that still evoked emotion without explicitly representing the human form. We also hoped to extend the sensory experience of this instrument beyond the purely visual.

Figure 3. Facial expressions arranged along the axes of arousal and valence, used to help product designers gauge user emotions about designs.

There is some nonrepresentational work on sensing emotion in alternate sensory channels. For example, in his book Sentics, Clynes describes characteristic movement patterns on a touch pad when users are asked to 'perform' a particular emotion with their finger [1]. Fagerberg et al.'s work on eMoto, a non-representational system for adding affective content to SMS messages on mobile phones, is an example of the use of gesture and touch to generate an emotional response [6]. Product designers know that surface materials and their tactile qualities can profoundly impact users' emotional responses to products [9], but there has been little systematic work done up to now to assess the specific emotional effects of various materials.

Method

Exploration One: Personal Home Objects

To begin the project, we decided to apply one traditional model of emotion to everyday household objects, to see what sort of physical properties caused them to be arrayed in which places in the emotional spectrum. We worked from the Russell circle [20], which arrays emotions along two axes: arousal and valence. A high-arousal, high-valence emotion such as ecstatic joy would be located in the upper right quadrant, for example, whereas a low-arousal, low-valence emotion such as depression would be located in the lower left quadrant. We had several researchers bring in objects that had emotional meaning for them, and array them upon a circle projected onto the floor (see Figure 4).
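The quadrant logic of the Russell circle can be stated compactly: an emotion is a point in a two-dimensional space (valence on the horizontal axis, arousal on the vertical), and its quadrant follows from the signs of the two coordinates. The sketch below illustrates this; the numeric coordinates assigned to each emotion are invented for the example, not values from the paper or from Russell's model.

```python
# Sketch of the Russell circumplex: valence on the x-axis, arousal on the
# y-axis; the quadrant follows from the signs of the coordinates.
# The emotion coordinates below are illustrative, not data from the study.

def quadrant(valence: float, arousal: float) -> str:
    """Return the circumplex quadrant for a (valence, arousal) pair in [-1, 1]."""
    horiz = "right" if valence >= 0 else "left"
    vert = "upper" if arousal >= 0 else "lower"
    return f"{vert} {horiz}"

emotions = {
    "ecstatic joy": (0.9, 0.9),   # high valence, high arousal
    "depression": (-0.8, -0.7),   # low valence, low arousal
    "anger": (-0.8, 0.8),
    "contentment": (0.7, -0.5),
}

for name, (v, a) in emotions.items():
    print(f"{name}: {quadrant(v, a)} quadrant")
```

Placing an object on the projected circle amounts to choosing such a coordinate pair by hand; the exploration asked what physical properties drove those choices.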

Findings

This early experiment revealed many properties of objects that help to evoke emotion: material, shape, 'give' in the hand, color, size, as well as memories associated with the object.

Figure 4. Household objects arrayed on the Russell circle.

Exploration Two: Colored Cylinders

We decided to isolate one dimension of the many object properties we found in step one, and test how people would use this dimension to convey their emotions while interacting with a system. We chose color as our first variable, as color had strong connotations in the first test and in other projects by Höök and colleagues. We crafted colored objects that could be held in the hand, and which were similar in weight, texture and other properties, differing only in color (see Figure 5). Then we brought in several people not familiar with our research, and asked them to do three things:

1. Create their own taxonomy of the objects, using the projected Russell circle (as we did in the initial test).
2. Interact with a computer game that evoked strong emotions, and use the objects to convey how they felt.
3. Discuss with us how it was to use the objects to convey emotion.

Figure 5. Early sensual evaluation objects arrayed on the Russell emotional circle.

Findings

Our testing revealed that some people used the colored objects to create a personal taxonomy of affect, but others found that their own narrative connotations for colors (e.g. red and green remind me of Christmas) got in the way of crafting and re-using a personal affective taxonomy during the game play. We found that users were experiencing emotions that didn't match neatly to their initial taxonomy, and were not sure how to indicate them—such as mild frustration, confusion, anticipation, and the like. These were not what they typically thought of as 'emotions' in the everyday sense, so they hadn't really planned how they would express them.

Based upon these results, we made two key design decisions:

1. We would move to a more biomorphic framework upon which to vary the objects, one that we hoped was less subject to narrative overlays. We drew from work by Disney animators in crafting less explicitly anthropomorphic forms that still managed to strongly convey emotion (see Figure 6). We were also influenced by prior investigation into qualities of movement and its analysis (e.g. the work of Laban; see [6] for further discussion of movement analysis and expression of affect).

2. We would take note of the emotions that users expressed during this test, and incorporate system-use-based emotions into our subsequent design thinking. How might one express mild frustration, confusion, and the like? These were the sorts of feedback we wanted to support with our instrument.

Figure 6. The famous Disney flour sack, an illustration of emotions conveyed by an object despite the absence of a clearly articulated human figure [12].

Iteration

At this point we solicited the support of a professional sculptor who had experience in crafting biomorphic forms. Rainey Straus (see www.raineystraus.com for more examples of her work) crafted a set of objects that had biomorphic qualities, working from ongoing discussions with us, and from a list of emotions we provided that was based upon our exploratory studies. The initial list was: confusion, frustration, fear, happiness, surprise, satisfaction, contentment, stress, and flow. She crafted a total of eight objects, not necessarily meant to be a one-to-one mapping to these emotions, but rather a successful set of tools for evoking/expressing this range of emotions (see Figures 7 and 8).


Figure 7. The Sensual Evaluation Instrument objects.

Straus created the objects in clay, then we had them cast in plastic in order to have a durable surface and multiples for use in different lab settings. Internally, we assigned names to the objects to make it easier to code data, as follows (following Figure 7): back row—spiky, pseudopod; next row—anteater, bubbly; next row—stone, doubleball, ball; front—barba papa (named after a figure from a French animated cartoon popular in Sweden). We did not use these or any other names for the objects with participants—these names are introduced only to aid you, the reader, in following the discussion of object use in this paper.

Figure 8. An alternate view to give the reader additional visual information about the contours of the objects.

Stage Two: Open-ended Testing of Prototype Objects

With the cast objects in hand, we designed a study to assess whether this new instrument would allow people to provide meaningful affective feedback while they engaged in interaction with a computer system. The aim of the study was both to explore use potential for the object set, and also to engage the students and faculty in a discussion of the act of using the objects, and potential interactions/innovations toward a refined design.

Method

In this stage, we took 12 individual participants (10 male, 2 female) through an hour-long session, which consisted of five phases:

• Explanation of research purpose and orientation to the objects. In this phase, the participant was shown the objects and encouraged to handle them, and the experimenter described the purpose of the research and emphasized that any type of usage of the objects to convey affective state was fine (multiple objects at once, movement of objects, and so forth).

• Use of objects with a subset of the IAPS (International Affective Picture Set). In this phase, the participant was shown a series of images taken from the IAPS [13], and asked to use the SEI objects to indicate affective response to each picture. (See the Findings section for examples of the images used.) The images were selected because their arousal/valence scores mapped roughly to the feelings that the artist initially intended to convey with the objects.

• Use of objects with a computer game. In this phase, the participant played through the first puzzle in a PC adventure game, The Curse of Monkey Island, and was instructed to use the objects to indicate affect during play. The experimenter was present during this phase, and was available to offer hints/tips on using the game during play.

• Use of objects during a chat. After the game, the participant was asked to chat using AIM instant messaging with the experimenter's assistant, to discuss how it was to play the game. The experimenter left the room during this portion, after instructing the participant to use the objects to indicate affect while chatting.

• A discussion of how it was to use the objects to give affective feedback. At the end, the experimenter returned and walked the participant through a series of questions about what it was like to use the objects, including solicitation of suggestions for improvement of the SEI. Questions included: What was it like to use the objects to express emotion? Did you find that you had a consistent set of mappings? How hard or easy was it to use them? Any other thoughts? Suggestions for changes or alternatives?

The sessions were digitally recorded (with participants' consent), using a picture-within-picture format, so that we could later analyze usage in conjunction with what the participant was seeing on-screen (see Figure 9).



Figure 9. Participants were recorded so that we could see how they manipulated the objects, as well as what they were seeing.

Analysis and Findings

Calibration with Images

Considering the images arrayed by the primary dimensions used in the IAPS (arousal and valence; see Figure 10 for sample images), we found that there were some visible trends in the use of objects. Although there were definitely individual differences in participant reactions to the images (for example, one student liked the shark image and felt very calm and positive about it as he loves sharks), looking at the object use reveals some relation to the valence and arousal metrics compiled for the IAPS (see Figures 11 and 12).

Figure 10. Sample high arousal/low valence (above) and lower arousal/higher valence (below) images selected from the IAPS.

If we consider the IAPS images arranged from most negative to most positive valence, we can see that the objects with sharp edges tended to be used in association with the negative-valence imagery (bottom of Figure 11). Participants used spiky and anteater more frequently in response to these images. Objects with smooth edges and few protrusions (ball, stone, barba papa, bubbly) tended to be used in conjunction with images with more positive valence (top of Figure 11).

Figure 11. Objects used in response to the IAPS images, arranged by typical image valence response.
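The valence trend described above is essentially a cross-tabulation: for each object placement, note whether the image's normative IAPS valence was negative or positive, and whether the chosen object was sharp-edged or smooth. The sketch below illustrates the shape of that analysis; the observations are invented purely for the example and are not the study's data, and the sharp/smooth grouping follows the internal object names given earlier in the paper.

```python
# Illustrative cross-tabulation of object use against image valence.
# The observations are invented for the example, not the study's data.
from collections import Counter

SHARP_EDGED = {"spiky", "anteater"}  # internal object names from the paper

# (image valence category, object used) -- hypothetical placements
observations = [
    ("negative", "spiky"), ("negative", "anteater"), ("negative", "spiky"),
    ("positive", "ball"), ("positive", "stone"), ("positive", "barba papa"),
    ("positive", "bubbly"), ("negative", "ball"),  # individual differences occur
]

tally = Counter(
    (valence, "sharp" if obj in SHARP_EDGED else "smooth")
    for valence, obj in observations
)
for (valence, shape), count in sorted(tally.items()):
    print(f"{valence:8s} valence / {shape:6s} object: {count}")
```

A tally like this makes the reported tendency (sharp objects with negative-valence images, smooth objects with positive-valence ones) visible at a glance while still exposing the individual exceptions.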


Figure 12. Objects used in response to the IAPS images, arranged by typical image arousal response.

The trends are less clear when it comes to the arousal level associated with the IAPS images. Those with the highest arousal level as well as negative valence (a gun pointed toward the viewer, a dog barking at the viewer) were associated with the sharper-edged objects. Mid-level arousal and positive valence images (the bride with arms out, the bride with child) were associated with rounded objects. There was not a clear pattern for low-arousal images (see Figure 12).

Individual Taxonomies and Use Patterns

Figure 13. The Sensual Evaluation Instrument objects, for reference when reading this section.

There were also patterns in usage during the session, and in discussion afterward, that echoed and also extended what was seen in the object calibration exercise outside the valence/arousal model. Participants tended to describe the smooth, rounded objects with fewer protrusions as happy, or calm, or fun in the post-discussion, and to use them when in such situations during play or chat. The ball was frequently given a higher-energy positive attribution than the stone, which was described as bored or as zenlike calm in post-discussion. Spiky and anteater both got fear and anger attributions from most participants (though one participant felt anteater should mean interest). Barba papa was often used to indicate humor. Bubbly was used for both humor and for confusion and frustration. Many participants commented that the bubbly form seemed chaotic and suggested confusion or indeterminacy. Pseudopod got some similar reactions—one participant used this to indicate that he had a goal in mind but it wasn't wholly resolved—a feeling of directed anticipation.

Despite these generally similar tendencies in associating the objects with particular affective states, participants used them in action in quite different ways.

Figure 14. A participant using multiple objects.

This participant (Figure 14) used multiple objects arrayed in a triangular formation to express his reaction. His use of the objects was limited to those that were most rounded in form (ball, stone, doubleball, barba papa), and he said he probably would not use the sharper forms (such as spiky) unless he saw the 'blue screen of death'.




Use of the objects during the session (the game and chat tasks together typically took ~30 minutes) ranged from 4 to 53 instances. Participants used the objects to express emotion both when playing the game with the experimenter present and when instant-message chatting with the experimenter out of the room, although usage was less frequent in the latter case (see Figure 17).

Figure 15. A participant stacking objects.

This participant (Figure 15) stacked objects in various ways during his session. He used every object except spiky.

Figure 16. A participant gesturing with an object.

This participant (Figure 16) shook the spiked object in response to a lag problem in the game during play (and at other times). He tended to hold the objects in his hands when they were 'active'. He made frequent and dramatic use of the objects, including broad gestures: over 50 instances in the 30-minute session.

Figure 17. Pattern of object usage instances in the two active tasks.

Figure 18. Pattern of multiple object usage instances, per participant.

Use of several objects at once varied from person to person, but overall, multiple use accounted for about 21% of total instances of object selection (see Figure 18).
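Counts like these fall out of a timestamped selection log. A minimal sketch, assuming a hypothetical log format; the events and numbers below are invented for illustration, not our actual data:

```python
# Sketch: per-task usage counts and the share of multi-object
# selections, from a hypothetical event log. Each event records the
# task and how many objects the participant picked up at once.
events = [
    {"task": "game", "objects": 1},
    {"task": "game", "objects": 3},   # e.g. a triangular formation
    {"task": "game", "objects": 1},
    {"task": "chat", "objects": 1},
    {"task": "chat", "objects": 2},
]

# Instances per task (Figure 17-style summary).
per_task = {}
for e in events:
    per_task[e["task"]] = per_task.get(e["task"], 0) + 1

# Fraction of selection instances involving more than one object
# (Figure 18-style summary).
multi_share = sum(e["objects"] > 1 for e in events) / len(events)
print(per_task, round(multi_share, 2))  # → {'game': 3, 'chat': 2} 0.4
```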

Anecdotal Discussion

Participants gave us both positive and negative feedback about this method of providing affective information about using a system. Positive comments included:

"I'd do this over surveys any day. I hate surveys!" (The participant explained that he liked being able to give feedback during as opposed to after, and that he liked having more than one object and being able to change their orientation.)

"They were fun to play with, just to roll them around in your hand." (This participant had an object in hand almost constantly during the session.)

"They were novel and interesting."

"I like the concept."

"I think a survey form would be as good, but you'd have to stop and think about it."

"I felt like I was very involved."

"It was fun. Much better than filling out a survey…feedback is more immediate."

"In retrospect I suppose they helped me to see when something wasn't working because I was like hmm which object should I use to convey why this won't work. By the same token, I knew that I had a good strategy when I was able to keep the happy object out and I could see the game was advancing."




Negative comments:

"I didn't feel like I got a lot of leverage out of that one (doubleball)." (Doubleball and pseudopod were objects that had less resonance for several participants.)

"It felt like there could have been more… like vastness. I would make something larger than the others." (Several participants indicated 'missing' emotions, and a desire for more objects. Missing emotions included: melancholy, a sense of accomplishment, hatefulness beyond anger, disingenuousness, and ever-so-slight frustration.)

"This is ambiguous." (Some expressed concern that the results would not be clear or definitive enough to use as data.)

Participants also made some very interesting suggestions for evolving the instrument, which included:

• adding the ability to squeeze or shape the objects somehow with one's own grip, to further indicate emotion.

• adding droopy objects to indicate despair or low energy level.

• introducing scale and texture as variables.

One other factor for consideration in the object use was the tendency for participants sometimes to match the shape of the object with the shape of the thing they were reacting to. This was especially so with the IAPS image calibration (some subjects used the pseudopod or barba papa in response to a gun image, or the doubleball or bubbly because it looked like the rounded mountains, and so forth). This tendency seemed to become less prevalent with the active system feedback sessions, but is a concern in further evolution of the shapes themselves.

GENERAL DISCUSSION OF RESULTS

This testing exercise seems to indicate that we are on the right track in our design of the Sensual Evaluation Instrument. Specifically:

• It seems to be fun and engaging for people, in part because it makes use of the body and sense of touch, and also because it allows for flexibility in response.

• There are indications (through use patterns and comments afterward) that this is a feasible way to give in-process feedback without too much disruption, as we had hoped.

• Use patterns and verbal descriptions seem to indicate that this could be evolved into a reasonably consistent instrument in terms of general affective dimensions, while maintaining a flexibility of expression for participants.

We believe this approach holds promise and are pursuing it further.

NEXT STEPS


Effectiveness for Designers

The first and most important follow-up step is to establish that this instrument can provide helpful feedback to a designer of a system. We are planning to conduct user feedback sessions with designs-in-progress in the coming year, and are seeking developers interested in using this method during their design process. In particular, we suspect that being able to see the nonverbal gestural feedback as it unfolds will help designers in different and interesting ways compared with getting summarized results from other methods for measuring user affect.

Cultural Transfer

One important claim we've made in this paper is that this instrument could be used across cultures. We have completed a study in Sweden replicating the method used in the U.S., are analyzing those results, and would like to solicit interested groups from other cultural backgrounds to conduct replication studies. We can provide an object set on loan for such studies.

Further Exploration of Affective Dimensions in Shape

The patterns in response to the objects created by our sculptor suggested the following sculptural dimensions of interest:

• rounded versus spiky (positive to negative valence)

• smooth versus bubbly or protruding surface (low versus high arousal)

• symmetrical versus asymmetrical (calmer or more directed/resolved versus confused/chaotic)

We are planning to explore iterations of objects based upon these dimensions (with the addition of the 'droopiness' dimension suggested by one participant), perhaps even crafting an interface that allows people to use sliders to craft their own affective objects, to give us a better understanding of these shape variables and how they relate to affect (the objects could then be 'printed' using a 3-D printing device and tested in further studies).
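One way to prototype such a slider interface is to treat each sculptural dimension as a normalized parameter of a shape description that a 3-D printing pipeline could consume. A sketch of the parameter bundle only; the names and [0, 1] ranges are our assumptions, not an implemented system:

```python
# Sketch: bundle slider positions into a parametric shape description.
# Dimension names mirror those listed above; the droopiness axis is the
# participant-suggested addition. All names and ranges are invented.
from dataclasses import dataclass

@dataclass
class ShapeParams:
    spikiness: float    # 0 = fully rounded, 1 = sharply spiked
    protrusion: float   # 0 = smooth surface, 1 = bubbly/protruding
    asymmetry: float    # 0 = symmetrical, 1 = strongly asymmetrical
    droopiness: float   # 0 = upright, 1 = fully drooping

    def __post_init__(self):
        # Reject slider values outside the normalized range.
        for name, value in vars(self).items():
            if not 0.0 <= value <= 1.0:
                raise ValueError(f"{name} must be in [0, 1], got {value}")

# A calm, rounded object versus a chaotic, spiky one.
calm = ShapeParams(spikiness=0.1, protrusion=0.0, asymmetry=0.2, droopiness=0.0)
chaotic = ShapeParams(spikiness=0.9, protrusion=0.8, asymmetry=0.9, droopiness=0.1)
```

A mesh generator (and ultimately the 3-D printer) would take a `ShapeParams` instance as input; that stage is beyond this sketch.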

Multiple Method Triangulation

We've also made the claim in this paper that this instrument could act as a helpful supplement to other modes of detecting/requesting affective state information. We are working toward a replication of our study that includes biometric monitoring as well as a post-survey, to compare results and get a more complete picture of what each method exposes regarding participant affect.
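Comparing the methods largely amounts to aligning each timestamped object selection with the biometric signal around it. A minimal sketch with synthetic data; the 1 Hz signal, window size, and selection times are illustrative assumptions:

```python
# Sketch: mean biometric reading in a window around each SEI selection,
# so object choices can be compared against, e.g., a skin-conductance
# trace. All values below are synthetic.
from statistics import mean

signal = {t: 0.5 + 0.01 * t for t in range(0, 60)}  # synthetic, 1 Hz
selections = [("spiky", 10), ("ball", 40)]           # (object, second)

def window_mean(t, half_width=2):
    """Mean signal value over [t - half_width, t + half_width]."""
    return mean(signal[s] for s in range(t - half_width, t + half_width + 1))

aligned = {obj: round(window_mean(t), 2) for obj, t in selections}
print(aligned)  # → {'spiky': 0.6, 'ball': 0.9}
```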

Dynamics

Finally, we are also considering embedding sensors to allow us to record and understand participants' gestures with the objects (as with the eMoto system [6]).
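With accelerometers embedded, a first-pass analysis might flag a 'shake' (like the gesture in Figure 16) as a burst of rapid direction reversals in the acceleration trace. A sketch with invented thresholds and sample data, not a tested detector:

```python
# Sketch: flag a shake gesture in a 1-D acceleration trace by counting
# sign changes between adjacent samples. Thresholds are invented.
def count_sign_changes(samples):
    """Number of adjacent sample pairs whose signs differ."""
    return sum(a * b < 0 for a, b in zip(samples, samples[1:]))

def is_shake(samples, min_changes=4):
    """Crude shake detector: many rapid direction reversals."""
    return count_sign_changes(samples) >= min_changes

still  = [0.1, 0.1, 0.2, 0.1, 0.1, 0.2]    # object resting in hand
shaken = [1.0, -0.9, 1.1, -1.2, 0.8, -1.0]  # vigorous back-and-forth
print(is_shake(still), is_shake(shaken))  # → False True
```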




CONCLUSION

Through iteration and prototype testing, we believe we have demonstrated the promise and potential of the Sensual Evaluation Instrument as a real-time, self-report method for eliciting affective responses to a computer system, toward creating an additional form of exchange between users and designers. We are excited about the possibilities for the instrument in further enabling design teams to engage in productive and reasonably-scaled user testing that improves the emotional experience for end users. We hope that other researchers and designers in the CHI community will take up our offer of using the SEI in other cultural contexts, and toward the development of systems, so that we can further refine the instrument toward a final stage of having a 'kit' that can be used by anyone, anywhere: a helpful addition to the range of methods we have to measure and understand affect in action.

ACKNOWLEDGMENTS

We thank the European Union for the funding of researchers dedicated to this project; we thank Rainey Straus for creating the objects, and for donating time to the project; and we thank those in the U.S. and Sweden who participated in our studies. Thanks also to those affective presence researchers who inform and inspire our work: Phoebe Sengers, Bill Gaver, Michael Mateas, and Geri Gay.

REFERENCES

1. Clynes, M. 1989. Sentics: The Touch of Emotions. MIT Press.

2. Desmet, P.M.A. 2003. Measuring emotion: development and application of an instrument to measure emotional responses to products. In Blythe, M.A., Overbeeke, K., Monk, A.F. & Wright, P.C. (Eds.), Funology: From Usability to Enjoyment (Human-Computer Interaction Series, 3, pp. 111-123). Dordrecht: Kluwer.

3. DePaulo, B.M., Rosenthal, R. 1979. The Structure of Nonverbal Decoding Skills. Journal of Personality 47(3), 506-517.

4. Djajadiningrat, J.P., Gaver, W., and Frens, J.W. 2000. Interaction Relabelling and Extreme Characters: Methods for Exploring Aesthetic Interactions. Proceedings of the conference on Designing Interactive Systems: processes, practices, methods, and techniques, August 2000.

5. Ekman, P. 1972. Emotion in the Human Face: Guidelines for Research and an Integration of Findings. Pergamon Press.

6. Fagerberg, P., Ståhl, A., and Höök, K. 2004. eMoto: emotionally engaging interaction. Journal of Personal and Ubiquitous Computing 8(5), 377-381.

7. Gaver, W., Beaver, J., Benford, S. 2003. Ambiguity as a Resource for Design. Proceedings of CHI 2003, 233-240.


8. Gaver, W., Dunne, T., and Pacenti, E. 1999. Design: Cultural Probes. Interactions 6(1), 21-29.

9. Green, W.S. and Jordan, P.M. 2002. Pleasure with Products: Beyond Usability. London: Taylor and Francis.

10. Isbister, K., and Höök, K. 2005. Evaluating Affective Interfaces: Innovative Approaches. CHI 2005. http://www.sics.se/~kia/evaluating_affective_interfaces/

11. Isbister, K., and Nass, C. 2000. Consistency of personality in interactive characters: Verbal cues, non-verbal cues, and user characteristics. International Journal of Human-Computer Studies 53(2), 251-267.

12. Johnston, O. and Thomas, F. 1995. The Illusion of Life: Disney Animation. Disney Editions.

13. Lang, P.J., Bradley, M.M., & Cuthbert, B.N. 2005. International Affective Picture System (IAPS): Digitized Photographs, Instruction Manual and Affective Ratings. Technical Report A-6. Gainesville, FL: The Center for Research in Psychophysiology, University of Florida.

14. LeDoux, J.E. 1996. The Emotional Brain. New York: Simon and Schuster.

15. Myers, D.G. 2002. Intuition: Its Powers and Perils. Yale University Press.

16. Norman, D.A. 2004. Emotional Design: Why We Love (or Hate) Everyday Things. New York: Basic Books.

17. Pashler, H.E. 1998. Central Processing Limitations in Sensorimotor Tasks. Pp. 265-317 in The Psychology of Attention. Cambridge, MA: MIT Press.

18. Picard, R. 1997. Affective Computing. Boston, MA: MIT Press.

19. Picard, R., and Daily, S.B. 2005. Evaluating affective interactions: Alternatives to asking what users feel. Presented at the 2005 CHI Workshop 'Evaluating Affective Interfaces'.

20. Russell, J.A. 1980. A Circumplex Model of Affect. Journal of Personality and Social Psychology 39(6), 1161-1178.

21. Sanders, L. (faculty web page) http://arts.osu.edu/2faculty/a_faculty_profiles/design_fac_profiles/sanders_liz.html

22. Wensveen, S.A.G. 1999. Probing experiences. Proceedings of the conference Design and Emotion, November 3-5 1999, Delft University of Technology, 23-29.

23. Wensveen, S.A.G., Overbeeke, C.J., & Djajadiningrat, J.P. 2000. Touch me, hit me and I know how you feel: A design approach to emotionally rich interaction. Proceedings of DIS'00, Designing Interactive Systems. ACM, New York, 48-53.

24. Wong, D., and Baker, C. 1998. Pain in Children: Comparison of Assessment Scales. Pediatric Nursing 14(1), 9-17.

