Navigation Functionalities for an Autonomous UAV Helicopter

APPENDIX A.

Fig. 1. The WITAS helicopter descending to the landing pad.

Fig. 2. Landing pad with reference pattern seen from the on-board camera.

A proper trade-off between accuracy, range, latency, and rate has to be found, optimizing the overall performance of the system.

Our method requires a special landing pad (Fig. 2). As unmanned helicopters usually operate from a designated home base, this is not a real constraint. A precise and fast pan/tilt camera is used to extend the range of the vision system and to decouple the helicopter attitude from the vision field. We developed a single-camera solution, as multi-camera systems make the system more complex and expensive and do not offer significant advantages when using known landmarks. For the experiments we used a helicopter platform (Fig. 1) which had been developed in the WITAS project [2,1].
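The attitude decoupling mentioned above amounts to steering the pan/tilt unit so that the detected pattern stays near the image centre, whatever the helicopter's pose. The following is a minimal illustrative sketch of such a tracking correction, not the WITAS implementation; the image size, field-of-view value, gain, and function name are assumptions.

```python
# Illustrative proportional pan/tilt tracker (not the WITAS code).
# Assumed: a 640x480 image and a camera with roughly 50 deg horizontal FOV.
IMG_W, IMG_H = 640, 480
FOV_X_DEG = 50.0
FOV_Y_DEG = FOV_X_DEG * IMG_H / IMG_W  # same angular resolution per pixel
GAIN = 0.5  # proportional gain, < 1 for smooth convergence

def pan_tilt_correction(pattern_px, pattern_py):
    """Angular corrections (deg) that move the pattern toward the image centre."""
    # Pixel offset of the detected pattern from the image centre.
    dx = pattern_px - IMG_W / 2
    dy = pattern_py - IMG_H / 2
    # Small-angle approximation: convert pixels to degrees via the FOV.
    d_pan = GAIN * dx * (FOV_X_DEG / IMG_W)
    d_tilt = GAIN * dy * (FOV_Y_DEG / IMG_H)
    return d_pan, d_tilt

# Pattern detected right of centre: the PTU must pan right (positive correction).
d_pan, d_tilt = pan_tilt_correction(480, 240)
```

Because the PTU absorbs attitude changes, the vision field no longer swings with every roll or pitch of the helicopter, which is what extends the usable range of the single camera.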

Vision-based control of small-size autonomous helicopters is an active area of research. A good overview of the state of the art can be found in [7]. Our contribution to the landing problem consists of: (a) many demonstrated landings with a system based only on images from a single camera and inertial data, using off-the-shelf computer hardware; (b) a wide envelope of starting points for the autonomous approach; (c) robustness to different weather conditions (wind, ambient light); and (d) a quantitative evaluation of the vision system and the landing performance.

2 Vision System<br />

The vision system consists of a camera mounted on a pan/tilt unit (PTU), a computer for image processing, and a landing pad (a foldable plate) with a reference pattern on its surface. In this section, we explain the design of the reference pattern, describe the image formation, and present the image processing algorithm.

The reference pattern is designed to fulfill the following criteria: fast recognition, accurate pose estimation at close and distant range, minimum size, and minimal asymmetry. We have chosen black circles on a white background, as they are fast to detect and provide accurate image features (Fig. 2). From
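Detecting black circles on a white background essentially reduces to thresholding the image and taking the centroids of the dark connected components; the centroids then serve as the accurate image features mentioned above. The following pure-Python sketch illustrates that idea on a synthetic binary image; it is not the paper's algorithm, and the threshold value and function name are assumptions.

```python
from collections import deque

# Minimal sketch of circle-feature extraction (not the paper's code):
# threshold dark pixels, find connected components, return their centroids.

def dark_component_centroids(img, thresh=128):
    """Return (row, col) centroids of connected dark regions in a 2D grayscale image."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for r in range(h):
        for c in range(w):
            if img[r][c] < thresh and not seen[r][c]:
                # BFS flood fill over one dark component (4-connectivity).
                queue, pixels = deque([(r, c)]), []
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] \
                                and img[ny][nx] < thresh:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cy, cx))
    return centroids

# Synthetic 6x6 image: white (255) background with one 2x2 dark blob.
img = [[255] * 6 for _ in range(6)]
for r, c in ((2, 2), (2, 3), (3, 2), (3, 3)):
    img[r][c] = 0
centroids = dark_component_centroids(img)  # one blob, centroid (2.5, 2.5)
```

In a real system the components would additionally be filtered by size and circularity before their centroids are passed on to pose estimation.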
