
Artur Babiarz, Robert Bieda, Karol Jędrasiak, Aleksander Nawrat (auth.); Aleksander Nawrat, Zygmunt Kuś (eds.): Vision Based Systems for UAV Applications. Studies in Computational Intelligence 481, Springer, 2013.


Recognition and Location of Objects in the Visual Field of a UAV Vision System

3 The Object Classification Algorithm

For the purpose of object recognition, the perceptron algorithm, a simple linear classifier, was used. This method solves the problem of designating a hyperplane in the l-dimensional feature space of the objects that divides the given space into two half-spaces. The determined hyperplane thereby separates the feature vectors belonging to two different object classes. The perceptron algorithm assumes that two linearly separable classes ω₁ and ω₂ exist. Therefore it is possible to determine a hyperplane w*ᵀx = 0 such that:

w*ᵀx > 0   ∀x ∈ ω₁   (16.1)

w*ᵀx < 0   ∀x ∈ ω₂   (16.2)
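Conditions (16.1) and (16.2) amount to a sign test on w*ᵀx. A minimal sketch in Python; the weight vector and sample feature vectors are illustrative assumptions, not values from the chapter:

```python
import numpy as np

# Hypothetical separating weight vector w* and one sample feature
# vector from each class (illustrative values only).
w_star = np.array([1.0, -2.0])
x_class1 = np.array([3.0, 1.0])   # w*^T x = 1 > 0, so class omega_1
x_class2 = np.array([1.0, 2.0])   # w*^T x = -3 < 0, so class omega_2

def classify(w, x):
    """Assign omega_1 if w^T x > 0, omega_2 if w^T x < 0 (conditions 16.1/16.2)."""
    return 1 if w @ x > 0 else 2

print(classify(w_star, x_class1))  # 1
print(classify(w_star, x_class2))  # 2
```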

The case described by this definition is the one where the hyperplane passes through the origin of the coordinate system associated with the feature space. In the more general case the hyperplane equation is wᵀx + w₀ = 0. To write it in the form of definitions (16.1)–(16.2), the feature space should be extended with an extra dimension. As a result, in the (l+1)-dimensional space the feature vector is defined as x' = [xᵀ, 1]ᵀ, and the vector of the hyperplane coefficients as w' = [wᵀ, w₀]ᵀ. Then w'ᵀx' = wᵀx + w₀. The problem of finding the coefficient vector w describing the hyperplane thus reduces to a classical optimization problem. In the perceptron algorithm the following cost function is minimized:

J(w) = Σ_{x∈Y} δₓ (wᵀx)   (17)

where Y is the subset of the training feature vectors of the two classes ω₁, ω₂ that were incorrectly classified by the weight vector w. The parameter δₓ takes the value −1 if x ∈ ω₁ and +1 if x ∈ ω₂. With the above assumptions and definitions it can easily be shown that the value of (17) is always positive, and zero when the set Y is empty (all training vectors are correctly classified). It can be shown that this optimization problem can be solved using the following iterative rule:

w(t+1) = w(t) − ρₜ Σ_{x∈Y} δₓ x   (18)

With a proper selection of the parameter ρₜ, the training algorithm (18) converges to the solution minimizing the quality index (17).
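The full procedure, feature-vector augmentation, the misclassified set Y with the δₓ convention, and the batch update rule (18), can be sketched as follows. This is a minimal NumPy sketch under the chapter's assumptions (linearly separable classes, fixed learning rate); the function name, sample data, and the fixed ρ are illustrative choices, not the authors' implementation:

```python
import numpy as np

def perceptron_train(X1, X2, rho=0.1, max_iter=1000):
    """Batch perceptron rule (18): w <- w - rho * sum_{x in Y} delta_x * x,
    with delta_x = -1 for x in omega_1 and +1 for x in omega_2, and
    Y the set of misclassified (augmented) training vectors.
    X1, X2: arrays whose rows are l-dimensional feature vectors."""
    # Extend each feature vector with a constant 1 so that the hyperplane
    # w'^T x' = w^T x + w_0 = 0 need not pass through the origin.
    X1a = np.hstack([X1, np.ones((len(X1), 1))])
    X2a = np.hstack([X2, np.ones((len(X2), 1))])
    X = np.vstack([X1a, X2a])
    delta = np.concatenate([-np.ones(len(X1a)), np.ones(len(X2a))])
    w = np.zeros(X.shape[1])
    for _ in range(max_iter):
        scores = X @ w
        # x in omega_1 is misclassified unless w^T x > 0 (16.1);
        # x in omega_2 is misclassified unless w^T x < 0 (16.2).
        mis = np.where(delta == -1, scores <= 0, scores >= 0)
        if not mis.any():
            return w  # cost (17) is zero: every training vector classified correctly
        w = w - rho * (delta[mis] @ X[mis])  # update rule (18)
    return w

# Usage on two tiny, linearly separable point sets (illustrative data):
X1 = np.array([[2.0, 2.0], [3.0, 3.0]])    # samples of omega_1
X2 = np.array([[-1.0, -1.0], [-2.0, -1.0]])  # samples of omega_2
w = perceptron_train(X1, X2)
# After convergence, w^T x' > 0 for omega_1 samples and < 0 for omega_2 samples.
```

Note that each misclassified vector contributes a positive term δₓwᵀx to (17), so the rule pushes the cost toward zero; the stopping test (empty Y) is exactly the condition under which (16.1)–(16.2) hold on the training set.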
