Navigation Functionalities for an Autonomous UAV Helicopter

A.3. PAPER III

Fig. 4. The filter architecture. [Flowchart: blocks for filter initialization, INS mechanization, KF prediction and KF update; decision nodes for "new camera update", ∆P < ∆max, Tbo > Tlim1 and Tbo > Tlim2, with the last branch resetting Tbo = 0.]
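The flowchart in Fig. 4 can be read as a per-cycle decision loop. The sketch below is one possible reading of it, reduced to a one-dimensional toy state; the names `t_bo`, `t_lim1`, `t_lim2` and `delta_max` come from the figure, while the branch semantics, all numeric values and the scalar KF arithmetic are illustrative assumptions.

```python
# Hypothetical reading of the Fig. 4 logic on a 1-D toy state x with
# covariance P. t_bo is the vision "blackout" timer from the figure.
def filter_cycle(x, P, t_bo, vision_fix, dt,
                 t_lim1=0.5, t_lim2=5.0, delta_max=1.0,
                 q=0.01, r=0.05):
    """One filter iteration: propagate, then gate and apply the vision update."""
    # "INS mechanization" + "KF prediction": here reduced to growing P
    P = P + q * dt

    if vision_fix is None:                 # no new camera update
        t_bo += dt                         # blackout timer keeps running
        if t_bo > t_lim2:                  # blind too long: re-initialize
            x, P, t_bo = 0.0, 1.0, 0.0
        return x, P, t_bo

    innovation = vision_fix - x
    # Outlier gate: accept the fix if the innovation is small, or if the
    # filter has been blind longer than t_lim1 (after long dead reckoning a
    # large innovation is plausible rather than an outlier).
    if abs(innovation) < delta_max or t_bo > t_lim1:
        K = P / (P + r)                    # scalar Kalman gain
        x = x + K * innovation
        P = (1.0 - K) * P
        t_bo = 0.0                         # "Tbo = 0" branch of the figure
    return x, P, t_bo
```

The two thresholds play different roles: Tlim1 widens the outlier gate after a blackout, while Tlim2 abandons dead reckoning entirely and restarts the filter.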

sity and the binarization threshold based on the intensity distribution of the pattern. The exposure controller controls the camera shutter time and iris, aiming to keep the background intensity in a certain range.
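The paper only states that shutter time and iris are driven to keep the background intensity in range; a minimal range-keeping rule of the kind described might look as follows, with all gains, limits and the multiplicative-step scheme being invented for illustration.

```python
# Hypothetical exposure rule: scale the shutter time until the mean
# background intensity falls inside [lo, hi]. All constants are assumptions.
def adjust_exposure(shutter_us, mean_intensity,
                    lo=100, hi=160, step=1.1,
                    min_us=20, max_us=20000):
    """Return the shutter time (microseconds) for the next frame."""
    if mean_intensity < lo:        # too dark: lengthen the exposure
        shutter_us *= step
    elif mean_intensity > hi:      # too bright: shorten it
        shutter_us /= step
    return min(max(shutter_us, min_us), max_us)   # respect hardware limits
```

A multiplicative step is a common choice here because perceived brightness responds roughly proportionally to exposure time over the usable range.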

3 Sensor Fusion

The position and attitude estimates provided by the vision system cannot be fed directly into the controller due to their intrinsic lack of robustness: the field of view can be temporarily occluded (for example by the landing gear), and the illumination conditions can change dramatically within just a few meters of motion (sun reflections, shadows, etc.). On the other hand, vision readings are very accurate when available.

Hence, a navigation filter based on a Kalman filter (KF) has been developed, fusing highly accurate 3D position estimates from the vision system with inertial data provided by the on-board accelerometers and angular rate gyros. Besides filtering out a large part of the noise and outliers, the filter provides a satisfactory dead-reckoning capability, sufficient to complete the landing even when the vision system is "blind" 1, see Fig. 8.

The implementation of the KF uses the error state space, or indirect, formulation with feedback mechanization (Fig. 4). The states of the filter are the estimated inertial navigation system (INS) errors. The three observations are given by the difference between the INS position and the position from the vision system (lateral, longitudinal and vertical position relative to the pattern). The advantage of the indirect formulation versus the direct formulation (position, velocity and attitude are among the state

1 During the last 50 cm before touchdown the vision system is often "blind" due to two factors: (a) the shade of the helicopter covers part of the pattern at touchdown, and (b) when the distance of the camera to the pattern is very small it is very hard for the controller of the pan/tilt unit to keep the pattern in the picture.
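As a concrete illustration of the indirect formulation, the sketch below runs a single-axis error-state KF with feedback mechanization: the INS dead-reckons from the accelerometer, the measurement is the INS position minus the vision position (i.e. the position error), and the estimated error is fed back to correct the INS, which resets the error state to zero. The two-state error model, the noise values and the time step are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Single-axis error-state KF with feedback mechanization (illustrative).
# Error state is [position error dp, velocity error dv]; Q, R are assumed.
def error_state_step(ins_pos, ins_vel, accel, vision_pos, P, dt,
                     Q=1e-4, R=1e-2):
    # INS mechanization: dead-reckon with the accelerometer
    ins_vel += accel * dt
    ins_pos += ins_vel * dt

    # KF prediction on the error state (the mean is zero after feedback,
    # so only the covariance needs propagating)
    F = np.array([[1.0, dt], [0.0, 1.0]])
    P = F @ P @ F.T + Q * np.eye(2)

    if vision_pos is not None:
        # observation: INS position minus vision position = position error
        z = ins_pos - vision_pos
        H = np.array([[1.0, 0.0]])
        S = H @ P @ H.T + R                  # innovation covariance
        K = P @ H.T / S                      # Kalman gain, shape (2, 1)
        dx = (K * z).ravel()                 # estimated [dp, dv]
        # feedback mechanization: correct the INS, error state returns to zero
        ins_pos -= dx[0]
        ins_vel -= dx[1]
        P = (np.eye(2) - K @ H) @ P
    return ins_pos, ins_vel, P
```

When the vision system is blind, the update branch is simply skipped and the INS carries on dead reckoning, which is exactly the behavior that bridges the final blind phase of the landing.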
