
Where am I? Sensors and Methods for Mobile Robot Positioning


110 Part I Sensors for Mobile Robot Positioning

[Figure 4.18 is a block diagram showing a cone-shaped object, laser, range gate, timing generator, and CCD array.]

Figure 4.18: Simplified block diagram of a three-camera configuration of the LORDS 3-D laser TOF rangefinding system. (Courtesy of Robotics Vision Systems, Inc.)

function will be performed by microchannel plate image intensifiers (MCPs) 18 or 25 millimeters in size, which will be gated in a binary encoding sequence, effectively turning the CCDs on and off during the detection phase. Control of the system will be handled by a single-board processor based on the Motorola MC-68040.

LORDS obtains three-dimensional image information in real time by employing a novel time-of-flight technique requiring only a single laser pulse to collect all the information for an entire scene. The emitted pulse journeys a finite distance over time; hence, light traveling for 2 nanoseconds will illuminate a scene farther away than light traveling for only 1 nanosecond.
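The round-trip arithmetic behind this idea can be sketched as follows; this is an illustrative calculation, not code from the LORDS system (the function name and printed values are the author's own):

```python
# Time-of-flight ranging: a return delayed by t seconds corresponds to a
# surface at range c*t/2 (the pulse travels out and back).
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_range(round_trip_s: float) -> float:
    """Range to the reflecting surface for a given round-trip flight time."""
    return C * round_trip_s / 2.0

# A return arriving 2 ns after emission comes from roughly 0.30 m away;
# one arriving after 1 ns comes from roughly 0.15 m.
print(round(tof_to_range(2e-9), 3))  # ~0.3 m
print(round(tof_to_range(1e-9), 3))  # ~0.15 m
```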

The entire sensing range is divided into discrete distance increments, each representing a distinct range plane. This is accomplished by simultaneously gating the MCPs of the observation cameras according to their own unique on-off encoding pattern over the duration of the detection phase. This binary gating alternately blocks and passes any returning reflection of the laser emission off objects within the field of view. When the gating cycles of each camera are lined up and compared, there exists a uniquely coded correspondence which can be used to calculate the range to any pixel in the scene.
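One plausible reading of this binary gating scheme can be sketched in code: if camera k is gated by bit k of the time-slot index, then the set of cameras that detect a return spells out, in binary, the range bin the return arrived in. The function names and the plain-binary (rather than, say, Gray-coded) pattern are assumptions for illustration, not the vendor's documented encoding:

```python
def gate_pattern(camera: int, n_slots: int) -> list[int]:
    """On/off gating for one camera: bit `camera` of each time-slot index
    (1 = MCP passes light during that slot, 0 = blocked)."""
    return [(slot >> camera) & 1 for slot in range(n_slots)]

def decode_range_bin(detections: list[int]) -> int:
    """Recover the time slot (range bin) of a return from which cameras
    saw it: detections[k] = 1 if camera k detected light, else 0."""
    return sum(bit << k for k, bit in enumerate(detections))

n_cameras = 3
n_slots = 1 << n_cameras  # 3 gated cameras distinguish 2^3 = 8 range planes

# A reflection arriving in slot 5 (binary 101) is passed by cameras 0 and 2
# but blocked by camera 1, so comparing the three images recovers bin 5:
slot = 5
detections = [gate_pattern(cam, n_slots)[slot] for cam in range(n_cameras)]
print(detections, decode_range_bin(detections))  # [1, 0, 1] 5
```

This also shows why adding gates reduces range ambiguity, as Figure 4.19 notes: each extra gated camera doubles the number of distinguishable range planes for the same single pulse.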

[Figure 4.19 is a timing diagram showing the transmitted pulse, the illuminated scene portion versus time, the object-to-lens delay, the received (delayed) return versus time, and the on-off cycles of range gates 1 (A), 2 (B), and 3 (C) over time slots 1 through 7.]

Figure 4.19: Range ambiguity is reduced by increasing the number of binary range gates. (Courtesy of Robotic Vision Systems, Inc.)
