Printed Program (pdf) - CHI 2012 - Association for Computing ...
Interactivity
AHNE: A Novel Interface for Spatial Interaction i425
Matti Niinimäki, Koray Tahiroglu, Aalto University, Finland
In this paper we describe AHNE (Audio-Haptic Navigation Environment), a three-dimensional user interface (3D UI) for manipulating virtual sound objects with natural gestures in a real environment. AHNE uses real-time motion tracking and custom-made glove controllers as input devices, and auditory and haptic feedback as output. We present the underlying system and a possible use of the interface as a musical controller.
GraphTrail: Analyzing Large Multivariate, Heterogeneous Networks while Supporting Exploration History i427
Cody Dunne, Nathalie Henry Riche, Bongshin Lee, Microsoft Research, UK
Ronald Metoyer, Oregon State University, USA
George Robertson, Microsoft Research, UK
(See associated paper on page 68)
QuickDraw: Improving Drawing Experience for Geometric Diagrams i428
Salman Cheema, University of Central Florida, USA
Sumit Gulwani, Microsoft Research, USA
Joseph LaViola, University of Central Florida, USA
(See associated paper on page 49)
A Handle Bar Metaphor for Virtual Object Manipulation with Mid-Air Interaction i429
Peng Song, Wooi Boon Goh, William Hutama, Chi-Wing Fu, Xiaopei Liu, Nanyang Technological University, Singapore
(See associated paper on page 56)
DisplayStacks: Interaction Techniques for Stacks of Flexible Thin-Film Displays i430
Aneesh Tarun, Queen’s University, Canada
Audrey Girouard, Carleton University, Canada
Roel Vertegaal, Queen’s University, Canada
(See associated paper on page 81)
Interactive Paper Substrates to Support Musical Creation i431
Jérémie Garcia, Theophanis Tsandilas, INRIA, France
Carlos Agon, IRCAM, France
Wendy Mackay, INRIA, France
(See associated paper on page 73)
Discovery-based Games for Learning Software i432
Tao Dong, University of Michigan, USA
Mira Dontcheva, Diana Joseph, Adobe Systems, USA
Karrie Karahalios, University of Illinois, USA
Mark Newman, Mark Ackerman, University of Michigan, USA
(See associated paper on page 79)
114 | ACM Conference on Human Factors in Computing Systems
ZeroTouch: An Optical Multi-Touch and Free-Air Interaction Architecture i433
Jonathan Moeller, Andruid Kerne, William Hamilton, Andrew Webb, Nicholas Lupfer, Texas A&M University, USA
(See associated paper on page 76)
FlexCam – Using Thin-Film Flexible OLED Color Prints as a Camera Array i434
Connor Dickie, Nicholas Fellion, Roel Vertegaal, Queen’s University, Canada
FlexCam is a novel compound camera platform that explores interactions with color photographic prints using thin-film flexible color displays. FlexCam augments a thin-film color Flexible Organic Light Emitting Diode (FOLED) photographic viewfinder display with an array of lenses at the back. Our prototype allows the photograph to act as a camera, exploiting the flexibility of the viewfinder as a means to dynamically reconfigure the images it captures. The optical characteristics of FlexCam’s camera array change when flexed, allowing users to dynamically expand and contract the camera’s field of view (FOV). Integrated bend sensors measure the amount of flexion in the display; the degree of flexion is used as input to software, which dynamically stitches images from the camera array and adjusts the viewfinder size to reflect the virtual camera’s FOV. Our prototype envisions the use of photographs as cameras in one aggregate flexible, thin-film device.
Toolset to Explore Visual Motion Designs in a Video Game i435
David Milam, School of Interactive Arts and Technology, Canada
Magy Seif El-Nasr, Northeastern University, USA
Lyn Bartram, Matt Lockyer, Chao Feng, Perry Tan, School of Interactive Arts and Technology, Canada
We describe a research toolset to explore visual designs in a video game. We focus specifically on visual motion, defined by attributes of motion, and its effect on accessibility, which may lead to a diminished experience for novice players. Eight expert game designers evaluated the tool embedded in a simple point-and-click game. Specifically, they controlled the attributes of speed, the size of game elements, and the number of elements on screen associated with game targets, distractions, and feedback. The tool allowed the experts to define difficulty settings and expose patterns, which they verified. Using the game, we then investigated the effect of visual motion on accessibility in a formal user study comprising 105 participants. As a follow-up to this work, we expanded the toolset to include 8 additional attributes of motion.
iRotate: Automatic Screen Rotation based on Face Orientation i437
Lung-Pan Cheng, Fang-I Hsiao, Yen-Ting Liu, Mike Y. Chen, National Taiwan University, Taiwan
(See associated paper on page 76)