Navigation Functionalities for an Autonomous UAV Helicopter


CHAPTER 5. SENSOR FUSION FOR VISION BASED LANDING

competences in several disciplines such as image processing, sensor fusion and control. Paper III describes the approach and solution to the complete problem. This chapter focuses on the sensor fusion problem involved in the vision based autonomous landing mode. Details of the image processing and control strategy are not described here; the reader interested in these problems should read Paper III in the appendix of this thesis.

The motivations for the development of a vision based landing mode are of two kinds: scientific and technical. The scientific motivation is that a helicopter which does not rely on external sources of information (such as GPS) contributes to the goal of a self-sufficient autonomous system. The technical motivation is that GPS technology is generally not robust when operating close to obstacles. In an urban environment, for example, the GPS signal can be obscured by buildings or corrupted by multipath reflections or nearby radio-frequency transmitters. The landing approach proposed in Paper III is completely independent of GPS, so it can be used to land the helicopter in proximity to the obstacles found in urban environments.

In order to stabilize and control a UAV helicopter, an accurate and reliable state estimate is required. The standard strategy for solving this problem is to use several sensors with different characteristics, such as inertial sensors and GPS, and to fuse their measurements with a Kalman filter. The integration of inertial sensors and GPS is common practice, and an extensive literature on the topic is available; several approaches can be found in [13, 25, 23].

The method used here to fuse vision data with inertial sensors is similar to that used for GPS and inertial sensor integration, with a number of differences in the implementation. The experience gained in many successful experimental landings with our RMAX platform provides strong confirmation that the same sensor integration technique used for GPS and inertial sensors can be applied when the GPS is replaced with a suitable image processing system. The vision based landing problem for an unmanned helicopter has also been addressed by other research groups; related work can be found in Paper III.

As already mentioned, the landing problem is solved using a single camera mounted on a pan-tilt unit and an inertial measurement unit (IMU)
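To illustrate how a camera on a pan-tilt unit could stand in for GPS as the position source, the sketch below converts pan and tilt angles (those that center the landing pattern in the image) and the current altitude into a horizontal offset from the pattern. The flat-ground geometry and the angle conventions are assumptions made for illustration only; they are not the method of Paper III.

```python
# Hypothetical geometry: the landing pattern lies on flat ground, the pan
# angle is measured from the body x-axis, and the tilt angle is measured
# downward from the horizontal.
import math

def offset_from_pattern(pan_rad, tilt_rad, altitude_m):
    """Horizontal (x, y) offset of the pattern relative to the camera."""
    ground_range = altitude_m / math.tan(tilt_rad)  # horizontal distance
    x = ground_range * math.cos(pan_rad)
    y = ground_range * math.sin(pan_rad)
    return x, y

# Camera looking 45 degrees down, straight ahead, from 10 m altitude:
print(offset_from_pattern(0.0, math.radians(45), 10.0))  # ~ (10.0, 0.0)
```

An offset computed this way could feed the same correction step that a GPS fix would, which is the substitution the paragraph above describes.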
