

camera.

There exists, however, a simple approach to increase the non-ambiguity range. If two images of the same scene are taken with different modulation frequencies, the distance can be reconstructed unambiguously up to a range that corresponds to the least common multiple of the two corresponding non-ambiguity ranges. This procedure is limited to cases where the objects undergo no or only limited motion in front of the camera. Otherwise, wrong distances will be estimated for pixels that move across object borders from one frame to the next and thus depict different objects at different distances in the two images.
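As a rough sketch of this two-frequency scheme, the following Python fragment combines two wrapped range measurements by a brute-force search over wrap counts. The modulation frequencies (20 MHz and 30 MHz), the tolerance, and the search bounds are illustrative assumptions and are not tied to any particular camera.

C = 299_792_458.0  # speed of light in m/s


def unambiguity_range(f_mod):
    """Maximum unambiguous one-way distance for a given modulation frequency."""
    return C / (2.0 * f_mod)


def unwrap_two_frequencies(d1, d2, f1=20e6, f2=30e6, tol=0.05):
    """Combine two wrapped range measurements d1, d2 (in metres) taken at
    modulation frequencies f1 and f2.  The true distance must equal both
    d1 + n1*r1 and d2 + n2*r2 for some wrap counts n1, n2, so a small
    brute-force search looks for a consistent pair.  With f1:f2 = 2:3 the
    combined non-ambiguity range is 2*r1 = 3*r2, roughly 15 m."""
    r1, r2 = unambiguity_range(f1), unambiguity_range(f2)
    best, best_err = None, tol
    for n1 in range(3):
        for n2 in range(4):
            c1, c2 = d1 + n1 * r1, d2 + n2 * r2
            if abs(c1 - c2) < best_err:
                best_err, best = abs(c1 - c2), 0.5 * (c1 + c2)
    return best  # None if no consistent combination is found


# Example: an object at 12 m wraps to about 4.51 m at 20 MHz and 2.01 m at 30 MHz.
r1, r2 = unambiguity_range(20e6), unambiguity_range(30e6)
print(unwrap_two_frequencies(12.0 - r1, 12.0 - 2 * r2))  # approximately 12.0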

2.7.2 Systematic Errors

As already mentioned in Section 2.6, TOF cameras can produce systematic errors, i.e. range measurement errors that are induced by deviations from an ideal sinusoidal signal e(t) of the illumination unit. Theoretically, these errors can be compensated by using a higher number of samples A_k to estimate the phase shift φ (Lange, 2000; Rapp, 2007). In practice, however, this is impractical as it would introduce a higher computational cost and aggravate motion artefacts (Kolb et al., 2009).
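To make the role of the samples A_k concrete, the sketch below estimates φ as the phase of the fundamental (first DFT bin) of N equally spaced samples of the correlation function. With more samples, the harmonics of a non-sinusoidal illumination signal fall into bins that do not influence the fundamental and therefore bias the estimate less. The sign convention depends on how the samples are defined and is an assumption here.

import math


def phase_from_samples(samples):
    """Phase of the fundamental (first DFT bin) of N equally spaced samples
    A_k of the correlation function.  For N = 4 this reduces to the familiar
    arctangent of sample differences, atan2(A1 - A3, A0 - A2), up to the
    sign convention used for the samples."""
    n = len(samples)
    re = sum(a * math.cos(2.0 * math.pi * k / n) for k, a in enumerate(samples))
    im = sum(a * math.sin(2.0 * math.pi * k / n) for k, a in enumerate(samples))
    return math.atan2(im, re) % (2.0 * math.pi)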

An alternative is to correct systematic errors in a post-processing step. This can, for example, be done using look-up tables (Kahlmann et al., 2007) or correction functions such as B-splines (Lindner and Kolb, 2006).
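A minimal sketch of such a post-processing correction, assuming a purely hypothetical calibration table, is given below; plain linear interpolation stands in for the B-spline correction functions fitted in the cited work.

import numpy as np

# Hypothetical calibration table: measured distance (m) -> systematic error (m),
# obtained by comparing camera readings against known reference distances.
CAL_DISTANCE = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
CAL_ERROR = np.array([0.04, -0.02, 0.03, -0.01, 0.02, -0.03])


def correct_depth(depth_map):
    """Subtract the interpolated systematic error from every pixel of a depth map."""
    return depth_map - np.interp(depth_map, CAL_DISTANCE, CAL_ERROR)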

2.7.3 Multiple Reflections

The working principle of TOF cameras is based on the assumption that the modulated light reaching a pixel is reflected from a single object at a well-defined distance and that the light travels directly back to the camera. In this case, the camera receives a single signal s(t) whose phase shift φ with respect to the emitted signal e(t) corresponds to the distance of the object. This assumption may be violated by multiple reflections, which can occur both in the scene and inside the camera.

Multiple reflections in the scene refer to the situation in which the light of the active illumination does not travel solely between the camera and the imaged object, but is also reflected at additional objects in the scene along its path from the camera to the object and back to the sensor. In such a case, the distance travelled by the light is longer, which results in an increased time of flight and, thus, a false range estimate (Gudmundsson et al., 2007b).
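This bias can be illustrated by modelling each return as a phasor at the modulation frequency: the pixel estimates a single phase from the superposition of all returns, so an additional, longer path pulls the range estimate beyond the true distance. The amplitudes, path lengths, and the 20 MHz modulation frequency in the sketch below are illustrative assumptions.

import cmath
import math

C = 299_792_458.0   # speed of light in m/s
F_MOD = 20e6        # assumed modulation frequency


def estimated_range(paths):
    """paths: list of (amplitude, total path length in metres) for every route
    the modulated light takes from the illumination unit back to the pixel.
    The pixel sees the superposition of all returns, so only a single phase,
    and hence a single distance, is estimated."""
    total = sum(a * cmath.exp(1j * 2.0 * math.pi * F_MOD * length / C)
                for a, length in paths)
    phase = cmath.phase(total) % (2.0 * math.pi)
    return phase * C / (4.0 * math.pi * F_MOD)  # one-way distance


# Direct return only: object at 2 m, round trip of 4 m -> estimate of 2.0 m.
print(estimated_range([(1.0, 4.0)]))
# A weaker second return that bounces off a nearby wall (1.5 m extra path)
# pulls the estimate beyond 2 m although the object has not moved.
print(estimated_range([(1.0, 4.0), (0.4, 5.5)]))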


Multiple reflections inside the camera have the effect that stray light from other
