
6.2 Error Propagation with CGA

Now that the expectation $\mathrm{E}(\underset{\sim}{h})$ of a random vector with functional dependence $h(x,y)$ can be evaluated, it remains to look for an appropriate expression for the covariance matrix of $\underset{\sim}{h}$. Starting again from $h(\underset{\sim}{x} \in \mathbb{R}^{s}) \in \mathbb{R}$, equation (6.2) can be utilized in

$$\mathrm{Cov}(\underset{\sim}{X}, \underset{\sim}{Y}) \;=\; \mathrm{E}(\underset{\sim}{X}\,\underset{\sim}{Y}) - \mathrm{E}(\underset{\sim}{X})\,\mathrm{E}(\underset{\sim}{Y}) \qquad (6.6)$$

to derive the familiar expression

$$\Sigma_{zz} \;\approx\; J_{h,x}(x)\, \Sigma_{xx}\, J_{h,x}(x)^{\mathrm{T}}, \qquad (6.7)$$

where $\underset{\sim}{z} := h(\underset{\sim}{x})$ was used. The Jacobian $J_{h,x}(x)$ is defined as

$$\bigl[J_{h,x}(x)\bigr]_{ij} \;:=\; \frac{\partial}{\partial x_j}\, h_i(x).$$
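As a purely illustrative aside, the entries $[J_{h,x}(x)]_{ij}$ can also be approximated by forward differences when no closed form is at hand. The following minimal Python sketch is not part of the thesis; the helper name `numerical_jacobian`, the step size and the example function are assumptions made here for demonstration only.

```python
import numpy as np

def numerical_jacobian(h, x, eps=1e-6):
    """Forward-difference estimate of [J_{h,x}(x)]_ij = d h_i(x) / d x_j (illustrative)."""
    hx = np.atleast_1d(h(x))
    J = np.zeros((hx.size, x.size))
    for j in range(x.size):
        dx = np.zeros_like(x)
        dx[j] = eps
        J[:, j] = (np.atleast_1d(h(x + dx)) - hx) / eps
    return J

# Small example: h(x) = [x_0 * x_1, x_0 + x_1**2] evaluated at x = (1, 2)
h = lambda x: np.array([x[0] * x[1], x[0] + x[1] ** 2])
J = numerical_jacobian(h, np.array([1.0, 2.0]))   # approx. [[2, 1], [1, 4]]
```

With such a Jacobian, equation (6.7) amounts to the matrix product `J @ Sigma_xx @ J.T`.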

At this point the splitting (6.3) can be reused, i.e. substituting $x \mapsto [x;y]$ yields the covariance matrix for $\underset{\sim}{z} := h(\underset{\sim}{x}, \underset{\sim}{y})$

$$\Sigma_{zz} \;\approx\; \begin{bmatrix} J_{h,x}(x,y) & J_{h,y}(x,y) \end{bmatrix} \begin{bmatrix} \Sigma_{xx} & \Sigma_{xy} \\ \Sigma_{yx} & \Sigma_{yy} \end{bmatrix} \begin{bmatrix} J_{h,x}(x,y)^{\mathrm{T}} \\ J_{h,y}(x,y)^{\mathrm{T}} \end{bmatrix}. \qquad (6.8)$$

On multiplying out this result it can be seen that each term is a variant of the linear equivalent (5.10).
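To make equation (6.8) concrete for the bilinear case discussed next, the following Python sketch assembles the block Jacobian $\begin{bmatrix} J_{h,x} & J_{h,y}\end{bmatrix}$ and the joint covariance of $[x;y]$ for $h^{a}(x,y) = x^{\mathrm{T}} G^{a} y$ and $h^{b}(x,y) = x^{\mathrm{T}} G^{b} y$. All matrices and vectors below are randomly generated placeholders, not data from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2
Ga, Gb = rng.normal(size=(n, m)), rng.normal(size=(n, m))

x = rng.normal(size=n)   # expectation of x
y = rng.normal(size=m)   # expectation of y

# Analytic Jacobians of h_a(x, y) = x^T G^a y:  d/dx = (G^a y)^T,  d/dy = x^T G^a
Jx = np.vstack([Ga @ y, Gb @ y])      # J_{h,x}(x, y), shape (2, n)
Jy = np.vstack([x @ Ga, x @ Gb])      # J_{h,y}(x, y), shape (2, m)
J = np.hstack([Jx, Jy])               # block Jacobian [J_{h,x}  J_{h,y}]

# Joint covariance of [x; y]: a random positive-definite example
M = rng.normal(size=(n + m, n + m))
Sigma_v = M @ M.T

# Equation (6.8): Sigma_zz ~ J Sigma_v J^T
Sigma_zz = J @ Sigma_v @ J.T
print(Sigma_zz)
```

Multiplying the three block factors out reproduces the four terms $J_{h,x}\Sigma_{xx}J_{h,x}^{\mathrm{T}} + J_{h,x}\Sigma_{xy}J_{h,y}^{\mathrm{T}} + J_{h,y}\Sigma_{yx}J_{h,x}^{\mathrm{T}} + J_{h,y}\Sigma_{yy}J_{h,y}^{\mathrm{T}}$, each a variant of (5.10).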

For the concerns of this thesis the accuracy of equation (6.8) regarding bilinear functions must still be investigated. Since only a second-order Taylor expansion was resorted to, one could expect the error propagation of covariances to be exact in the bilinear case. But it has to be taken into account that, because of equation (6.6), terms like $\mathrm{E}(\underset{\sim}{h}^{a}\, \underset{\sim}{h}^{b})$ may occur, that is, in the notation of (6.5),

$$\mathrm{E}\bigl(\underset{\sim}{h}^{a}\, \underset{\sim}{h}^{b}\bigr) \;\approx\; \mathrm{E}\bigl(\underset{\sim}{x}_{i_1} \underset{\sim}{x}_{i_2}\, \underset{\sim}{y}_{j_1} \underset{\sim}{y}_{j_2}\bigr)\, G^{a}_{i_1 j_1}\, G^{b}_{i_2 j_2}.$$

Correspondingly, second derivatives might no longer be adequate; instead, a Taylor approximation of order four would be necessary. This is gone through in [93, 96] with the result that an additional bias term has to be added to equation (6.8) so as to reach exactness$^1$. Here a different method is chosen to provide the bias term: it is proven in [74], page 134, that the covariance of two symmetric quadratic forms $\underset{\sim}{v}^{\mathrm{T}} A\, \underset{\sim}{v}$ and $\underset{\sim}{v}^{\mathrm{T}} B\, \underset{\sim}{v}$, respectively, is given by

$$\mathrm{Cov}\bigl(\underset{\sim}{v}^{\mathrm{T}} A\, \underset{\sim}{v},\; \underset{\sim}{v}^{\mathrm{T}} B\, \underset{\sim}{v}\bigr) \;=\; 4\, v^{\mathrm{T}} A\, \Sigma_{vv}\, B\, v \;+\; 2\,\mathrm{tr}(A\, \Sigma_{vv}\, B\, \Sigma_{vv}) \qquad (6.9)$$

where $\underset{\sim}{v} \in \mathbb{R}^{v}$ denotes a normally distributed random vector with expectation $v := \mathrm{E}(\underset{\sim}{v})$ and covariance matrix $\Sigma_{vv} = \mathrm{Cov}(\underset{\sim}{v}, \underset{\sim}{v})$, as usual. Now let, w.l.o.g., $\underset{\sim}{v} = [\underset{\sim}{x}; \underset{\sim}{y}]$,

$$A := \frac{1}{2} \begin{bmatrix} 0 & G^{a} \\ G^{a\mathrm{T}} & 0 \end{bmatrix} \quad\text{and}\quad B := \frac{1}{2} \begin{bmatrix} 0 & G^{b} \\ G^{b\mathrm{T}} & 0 \end{bmatrix}.$$

$^1$Assuming normality for $\underset{\sim}{x}$ and $\underset{\sim}{y}$.
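To make the size of the bias term tangible, the following Python sketch builds $A$ and $B$ from randomly generated $G^{a}$, $G^{b}$ as above, evaluates equation (6.9), and checks it against a Monte Carlo estimate. The first summand $4\,v^{\mathrm{T}} A \Sigma_{vv} B v$ is what the first-order rule (6.8) reproduces; the trace term is the additional bias. All numbers here are illustrative placeholders, not results from the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 2
Ga, Gb = rng.normal(size=(n, m)), rng.normal(size=(n, m))

def sym_block(G):
    """A = 1/2 [[0, G], [G^T, 0]] so that v^T A v = x^T G y for v = [x; y]."""
    n, m = G.shape
    return 0.5 * np.block([[np.zeros((n, n)), G],
                           [G.T, np.zeros((m, m))]])

A, B = sym_block(Ga), sym_block(Gb)

v = rng.normal(size=n + m)                  # expectation of v = [x; y]
M = rng.normal(size=(n + m, n + m))
Sigma = M @ M.T                             # covariance matrix Sigma_vv

# Equation (6.9): exact covariance of the two quadratic forms under normality
first_order = 4 * v @ A @ Sigma @ B @ v     # reproduced by the first-order rule (6.8)
bias        = 2 * np.trace(A @ Sigma @ B @ Sigma)
exact       = first_order + bias

# Monte Carlo check with normally distributed samples of v
samples = rng.multivariate_normal(v, Sigma, size=200_000)
qa = np.einsum('ni,ij,nj->n', samples, A, samples)
qb = np.einsum('ni,ij,nj->n', samples, B, samples)
print(exact, np.cov(qa, qb)[0, 1])          # the two numbers should agree closely
```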
