PNNL-13501 - Pacific Northwest National Laboratory

and interactions. The flexibility of the environment enables any user to add, delete, or modify objects in the environment in real time.

This dynamic architecture naturally supports the creation of dynamic avatars. Each avatar consists of a number of independent dynamic objects (limbs) that are connected at user-defined points (joints). The objects communicate their relative positions, configurations, and other state to each other, so that, for instance, rotating the avatar's torso automatically rotates the legs and arms, while each leg can still be raised independently. In this way, the user can posture the avatar in real time to create unique and distinctive mannerisms. Because people are very good at learning and identifying one another by behavioral mannerisms, this will enable others to trust that a user is who they appear to be.
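The limb hierarchy described above behaves like a standard forward-kinematics chain: a parent's rotation propagates to every attached child, while each child keeps its own independent local angle. A minimal 2D sketch of that idea, with hypothetical class and joint names (the report does not give its data structures):

```python
import math

class Joint:
    """One node in an avatar's limb hierarchy (illustrative, not the report's code)."""

    def __init__(self, name, offset, angle=0.0):
        self.name = name
        self.offset = offset   # attachment point relative to parent, (x, y)
        self.angle = angle     # local rotation in radians, independent of parent
        self.children = []

    def attach(self, child):
        self.children.append(child)
        return child

    def world_positions(self, origin=(0.0, 0.0), parent_angle=0.0):
        """Propagate rotations down the chain: rotating a parent (e.g. the
        torso) moves every attached limb, while each limb keeps its own angle."""
        total = parent_angle + self.angle
        ox, oy = self.offset
        x = origin[0] + ox * math.cos(parent_angle) - oy * math.sin(parent_angle)
        y = origin[1] + ox * math.sin(parent_angle) + oy * math.cos(parent_angle)
        positions = {self.name: (x, y)}
        for child in self.children:
            positions.update(child.world_positions((x, y), total))
        return positions

torso = Joint("torso", (0.0, 0.0))
left_leg = torso.attach(Joint("left_leg", (0.0, -1.0)))
right_leg = torso.attach(Joint("right_leg", (0.0, -1.0)))

left_leg.angle = math.pi / 4   # raising one leg leaves the other unchanged...
torso.angle = math.pi / 2      # ...while rotating the torso carries both legs with it
```

Here each joint's position is computed from its parent's accumulated rotation, which is what makes the torso rotation carry the limbs along automatically.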

Usability evaluations conducted midway through this project indicated that the dynamic avatars had significant potential as collaborative tools by facilitating nonverbal communication. However, these studies also revealed serious deficits in the mouse-and-keyboard control mechanism. The second year of this project therefore focused primarily on developing alternative mechanisms for controlling position and motion in six dimensions. Three approaches were studied: the traditional mouse/keyboard combination, a six-degree-of-freedom controller referred to as the Spaceball, and the PNNL-developed gesture-based controller known as the Human-Information Workspace (HI-space). The pros and cons of each of these interaction tools were identified, a potential solution was described, and a prototype was developed.

Results and Accomplishments

Human-form avatars were created, each with 14 independently flexible joints, including the neck, shoulders, elbows, wrists, waist, hips, knees, and ankles. The joints can be constrained to meet natural human limitations (the head cannot rotate down through the chest, for instance) or left unconstrained, at the user's choice. Figure 1 illustrates the avatars' posture and motion flexibility.
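The optional joint constraints can be implemented as simple per-joint angle clamping. A minimal sketch, assuming hypothetical limit values (the report does not specify the actual ranges used):

```python
# Illustrative per-joint limits in degrees; not the report's actual values.
JOINT_LIMITS = {
    "neck":  (-60.0, 60.0),   # pitch: head cannot rotate down through the chest
    "elbow": (0.0, 150.0),
    "knee":  (0.0, 140.0),
}

def set_joint_angle(joint, angle, constrained=True):
    """Clamp the requested angle to the joint's natural range, or pass it
    through unchanged when constraints are disabled, at the user's choice."""
    if constrained and joint in JOINT_LIMITS:
        lo, hi = JOINT_LIMITS[joint]
        return max(lo, min(hi, angle))
    return angle

set_joint_angle("neck", -90.0)                     # clamped to the natural limit
set_joint_angle("neck", -90.0, constrained=False)  # left unconstrained
```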

The second phase of this project explored the utility of various interaction mechanisms beyond the mouse and keyboard, which were deemed inadequate in usability testing. Initial exploration was done using a Spaceball. The Spaceball consists of a sensor ball and an array of buttons. Sensors inside the ball detect pressure (push, pull, or twist) and use that information to move the appropriate object in the virtual space. The buttons can be mapped to task-appropriate commands. For this effort, the buttons were mapped to the various limbs of the avatar: arms and legs. For example, to use the Spaceball controller, the user pressed a button to select the upper arm and then pushed on the ball to contort the arm as appropriate.

Figure 1. Multi-jointed avatars "dancing"
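The button-to-limb mapping described above can be pictured as a fixed dispatch table: a button press selects one limb, and subsequent ball input is routed only to that limb. A minimal sketch with hypothetical button numbers and limb names (none of these identifiers come from the report):

```python
# Hard-coded at construction time, mirroring the limitation noted below:
# the mapping cannot be dynamically adjusted during a session.
BUTTON_MAP = {
    1: "left_upper_arm",
    2: "right_upper_arm",
    3: "left_leg",
    4: "right_leg",
}

class SpaceballController:
    """Routes the sensor ball's input to one selected limb at a time."""

    def __init__(self, button_map):
        self.button_map = button_map
        self.selected = None

    def press_button(self, button):
        """Select the limb mapped to this button (None if unmapped)."""
        self.selected = self.button_map.get(button)
        return self.selected

    def ball_input(self, push, pull, twist):
        """Apply the ball's pressure reading to the currently selected limb."""
        if self.selected is None:
            return None
        return (self.selected, (push, pull, twist))

ctrl = SpaceballController(BUTTON_MAP)
ctrl.press_button(1)             # select the left upper arm
ctrl.ball_input(0.5, 0.0, 0.2)   # push and twist contort that one limb only
```

The single `selected` field makes the limitation concrete: only one limb can receive input at a time, unlike a gesture-based interface where multiple body parts could in principle be manipulated at once.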

The Spaceball controller provides one valid mechanism for controlling the avatars in the virtual world. However, like the mouse, it is inherently limited to moving a single limb or object at a time. Further, the mapping of limbs to buttons must be hard-coded and cannot be dynamically adjusted. This lack of flexibility suggested that continued research was needed to explore alternative control mechanisms. Accordingly, the final phase of this project incorporated other ongoing research into next-generation interfaces, using the HI-space to posture and move the avatars. The prototype controller, shown in Figure 2, uses a video camera mounted over a table-sized surface to track the user's gestures. Selection of a body part to move is done by extending the forefinger and

Figure 2. The HI-space being used to control an avatar

Computer Science and Information Technology 165
