
Interactivity

ShoeSense: A New Perspective on Hand Gestures and Wearable Applications i406
Gilles Bailly, Jörg Müller, Technische Universität, Germany
Michael Rohs, University of Munich, Germany
Daniel Wigdor, University of Toronto, Canada
Sven Kratz, University of Munich, Germany
Dennis Guse, Technische Universität, Germany

When the user is engaged with a real-world task, it can be inappropriate or difficult to use a smartphone. To address this concern, we developed ShoeSense, a wearable system consisting in part of a shoe-mounted depth sensor pointing upward at the wearer. ShoeSense recognizes relaxed and discreet as well as large and demonstrative hand gestures. In particular, we designed three gesture sets (Triangle, Radial, and Finger-Count) for this setup, which can be performed without visual attention. The advantages of ShoeSense are illustrated in five scenarios: (1) quickly performing frequent operations without reaching for the phone, (2) discreetly performing operations without disturbing others, (3) enhancing operations on mobile devices, (4) supporting accessibility, and (5) artistic performances. We present a proof-of-concept wearable implementation based on a depth camera and report on a lab study comparing social acceptability, physical and mental demand, and user preference. A second study demonstrates a 94-99% recognition rate for our recognizers.

Mobile ActDresses: Programming Mobile Devices by Accessorizing i407
Mattias Jacobsson, Ylva Fernaeus, Stina Nylander, Swedish Institute of Computer Science, Sweden

Mobile ActDresses is a design concept in which existing practices of accessorizing, customizing and manipulating a physical mobile device are coupled with the behaviour of its software. With this interactivity demonstrator we will provide a hands-on experience of this kind of playful manipulation. We provide two examples of how to implement Mobile ActDresses, using quick-and-dirty hacks to create custom shells and jewellery for controlling the behaviour of the phone.

AMARA: The Affective Museum of Art Resource Agent i408
S. Joon Park, Drexel University, USA
Gunho Chae, Korea Advanced Institute of Science and Technology, Republic of Korea
Craig MacDonald, Drexel University, USA
Robert Stein, The Indianapolis Museum of Art, USA
Susan Wiedenbeck, Drexel University, USA
Jungwha Kim, Korea Advanced Institute of Science and Technology, Republic of Korea

This interactive system uses an embedded agent for question-based art-collection search on the platform of the Indianapolis Museum of Art website. Unlike a keyword search box, AMARA helps users browse and search for artwork by asking them simple questions whose answers are mapped to social tags. Thus, users do not need to be subject-matter experts who can supply specific search terms. In designing AMARA, we focused on creating an enjoyable browsing experience and helping users determine their known and unknown art preferences.


Design of an Exergaming Station for Children with Cerebral Palsy i409
Hamilton Hernandez, Nicholas Graham, Darcy Fehlings, Lauren Switzer, Zi Ye, Quentin Bellay, Md Ameer Hamza, Cheryl Savery, Tadeusz Stach, Queen’s University, Canada

(See associated paper on page 92)

Scoop! A Movement-based Math Game Designed to Reduce Math Anxiety i410
Katherine Isbister, NYU-Poly, USA
Mike Karlesky, Polytechnic Institute of NYU, USA
Jonathan Frye, New York University, USA
Rahul Rao, Polytechnic Institute of NYU, USA

In this paper, we describe Scoop!, a movement-based game designed to reduce math anxiety. The game draws on research on the effects of ‘power poses’ to explore whether movement mechanics can shift players’ feelings about math. The Interactivity demonstration includes both a ‘high power’, Kinect-driven version of the game and a ‘low power’, track-pad-driven version. CHI attendees can try out both versions to physically experience the effects.

EyeRing: An Eye on a Finger i411
Suranga Nanayakkara, Singapore University of Technology and Design, Singapore
Roy Shilkrot, Pattie Maes, Massachusetts Institute of Technology, USA

Finger-worn devices are a greatly underutilized form of interaction with the surrounding world. By putting a camera on a finger, we show that many visual-analysis applications, for visually impaired people as well as the sighted, become seamless and easy. We present EyeRing, a ring-mounted camera, enabling applications such as identifying currency and navigating, as well as helping sighted people tour an unknown city or intuitively translate signage. The ring apparatus is autonomous; however, our system also includes a mobile phone or computation device to which it connects wirelessly, and an earpiece for information retrieval. Finally, we will discuss how different finger-worn sensors may be extended and applied to other domains.

IllumiShare: Sharing Any Surface i412
Sasa Junuzovic, Kori Inkpen, Tom Blank, Anoop Gupta, Microsoft Research, UK

(See associated paper on page 71)
