a 2.5D visual control method. Deguchi et al. [22] proposed a method that decouples translation and rotation. Camera calibration is a tedious task, and the pre-calibrated cameras used in visual control methods greatly limit the flexibility of the system. Therefore, many researchers pursue visual control methods with self-calibrated or uncalibrated cameras. Kragic et al. [34] gave an example of self-calibrating a camera from the image and the CAD model of the object in their visual control system. Many visual control methods with uncalibrated cameras have been proposed; these belong to the class of image-based visual control methods, in which the camera parameters are not estimated individually but are absorbed into the image Jacobian matrix. For instance, Shen et al. [43] restricted the working space of the end-effector to a plane perpendicular to the optical axis of the camera in order to eliminate the camera parameters from the image Jacobian matrix. Xu et al. [21] developed a visual control method for the end-effector of a robot with two uncalibrated cameras, estimating the distances based on cross-ratio invariance.

2.4 Games and Interaction

Advances in the technological medium of video games have recently included the deployment of physical activity-based controller technologies, such as the Wii [27], and vision-based controller systems, such as Intel's Me2Cam [13]. The rapid deployment of millions of iRobot Roomba home robots [14] and the great popularity of robotic play systems, such as LEGO Mindstorms and NXT [5], now present an opportunity to extend the realm of video games even further, into physical environments, through the direct integration of human-robot interaction techniques and architectures with video game experiences.

Over the past thirty to forty years, a synergistic evolution of robotic and video game-like programming environments, such as Turtle Logo [36], has occurred. At the MIT Media Lab, these platforms have been advanced through the constructionist pedagogies, research, and collaborations of Seymour Papert, Marvin Minsky, Mitch Resnick, and their colleagues, leading to Logo [7], StarLogo [37], programmable Crickets, Scratch [6], and LEGO Mindstorms [37]. In 2000, KidsRoom [18] demonstrated that an immersive educational gaming environment with projected objects and characters in physical spaces (e.g., on the floor or walls) could involve children in highly interactive games, such as hide-and-seek. In 2004, RoBallet [20] advanced these constructionist activities further, blending elements of projected vir-
