
Master Thesis - Hochschule Bonn-Rhein-Sieg


4. Overall evading concept, Master Thesis, Björn Ostermann, page 49 of 126

To record the points, similarly to the recording of the background image in chapter 4.3.2, 500 samples are taken by the 3D camera. The median of each targeted pixel is computed and stored. From these values and the corresponding values in the robot's coordinate frame, the transformation matrix between the camera's and the robot's frame is computed, using the algorithm described in chapter 5.5.2.
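The per-pixel median over the sample stack can be sketched as follows. This is a minimal illustration, not the thesis implementation: the function name `median_depth_image` and the use of NumPy are assumptions, and the frame dimensions are placeholders.

```python
import numpy as np

def median_depth_image(samples):
    """Compute the per-pixel median over a stack of depth frames.

    samples: array of shape (n_frames, height, width) with raw depth values.
    Returns a (height, width) array holding the median of each pixel, which
    suppresses the sensor noise present in any single frame.
    """
    samples = np.asarray(samples, dtype=float)
    return np.median(samples, axis=0)

# Example: 500 noisy frames of a small depth image around 1000 mm
rng = np.random.default_rng(0)
frames = 1000.0 + rng.normal(0.0, 5.0, size=(500, 2, 2))
stable = median_depth_image(frames)
```

The median is preferred over the mean here because single outlier readings from the 3D camera then have no influence on the stored pixel value.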

As with the background, the voxel data acquired in this step can be stored to disk and retrieved later, which shortens the start-up time of the system. Only the voxel positions have to be stored; the transformation matrix can easily be recomputed from these values during the set-up phase.
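The store-or-record logic described above can be sketched like this. The function names (`save_voxels`, `load_or_record`), the JSON file format, and the cache path are illustrative assumptions; the thesis does not specify a storage format.

```python
import json
import os
import tempfile

def save_voxels(path, voxels):
    # voxels: list of (x, y, z) tuples, e.g. marker positions in the camera frame
    with open(path, "w") as f:
        json.dump(voxels, f)

def load_or_record(path, record_fn):
    """Return cached voxel positions if present; otherwise record and cache them.

    record_fn stands in for the (slow) 500-sample recording procedure; it is
    only invoked when no cached file exists, shortening system start-up.
    """
    if os.path.exists(path):
        with open(path) as f:
            return [tuple(v) for v in json.load(f)]
    voxels = record_fn()
    save_voxels(path, voxels)
    return voxels

# Example: the expensive recording runs once; the second call hits the cache
cache = os.path.join(tempfile.gettempdir(), "marker_voxels.json")
if os.path.exists(cache):
    os.remove(cache)
calls = []
def record():
    calls.append(1)  # count how often the recording procedure actually runs
    return [(0.1, 0.2, 0.3), (0.4, 0.5, 0.6)]
first = load_or_record(cache, record)
second = load_or_record(cache, record)
```

Only the raw voxel positions are persisted; anything derivable from them, such as the camera-to-robot transformation matrix, is recomputed after loading.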

4.4 Path planning control

The path planning control thread has to compute the solutions described in chapter 2.3. That means it has to find a path along which the working robot avoids collisions with intruding objects while still achieving its desired goal as far as possible. The primary goal in this task is the avoidance of collisions.

As shown in Figure 18, the path planning control thread acquires pre-processed data from the camera control thread. This data contains, among other things, the distance between the robot and objects intruding into the workspace and the outline of the reachable space.

The path planning control thread combines these data with a set of defined goal positions the robot has to drive through while working. In this project, to demonstrate the path planning and approach detection, the robot was given four goal positions that form a rectangle in front of the robot, through which it attempts to drive. From the combined data the thread calculates the next pose (position and orientation) the robot has to reach and the speed at which the robot can do so, depending on the chosen approach (see chapter 4.3.4 – reachable space or distance measurement). The result is transmitted to the robot control thread (see chapter 4.5) via the data exchange object.
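One planning step for the distance-measurement approach might look like the sketch below: pick the next goal from the rectangle of four positions and scale the speed with the measured distance to the nearest intruding object. The function name, the linear speed ramp, and all thresholds and coordinates are assumptions for illustration, not values from the thesis.

```python
def next_command(current_goal_idx, goals, distance_to_intruder,
                 stop_dist=0.2, full_speed_dist=1.0, max_speed=1.0):
    """Choose the next goal pose and a collision-dependent speed factor.

    The speed scales linearly with the distance to the nearest intruding
    object: zero at stop_dist (robot halts), max_speed at full_speed_dist
    and beyond. Distances are in metres, speed as a fraction of maximum.
    """
    # Cycle through the goal list so the robot keeps driving the rectangle
    goal = goals[current_goal_idx % len(goals)]
    if distance_to_intruder <= stop_dist:
        speed = 0.0
    elif distance_to_intruder >= full_speed_dist:
        speed = max_speed
    else:
        frac = (distance_to_intruder - stop_dist) / (full_speed_dist - stop_dist)
        speed = max_speed * frac
    return goal, speed

# Four hypothetical goal positions forming a rectangle in front of the robot
rect = [(0.5, -0.3, 0.4), (0.5, 0.3, 0.4), (0.5, 0.3, 0.8), (0.5, -0.3, 0.8)]
```

The returned pose and speed correspond to the values the path planning control thread hands to the robot control thread via the data exchange object.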
