
Chapter 1. Introduction

cannot be larger than the hardware, the input space observable with a camera is not limited by the camera's form factor. This property of "device-external interfaces" is of increasing importance to ever-shrinking portable electronics. However, dealing with person-specific variations, cluttered backgrounds, changing lighting conditions, camera specifics, rapid relative motion, and hard real-time requirements has so far prevented vision-based interfaces (VBIs) from achieving robustness and usability in settings other than the lab environment. Recently, Sony's EyeToy (a full-body VBI for the PlayStation 2) and Canesta's virtual keyboard (which lets users type on a keyboard projected onto any flat surface) have proven that consumer-grade applications have become feasible. Yet they are still very limited in their interaction capabilities, and the keyboard requires customized hardware.
