v2007.09.17 - Convex Optimization

D.1. DIRECTIONAL DERIVATIVE, TAYLOR SERIES

\[
\nabla^2 g_{mn}(X) =
\begin{bmatrix}
\nabla\dfrac{\partial g_{mn}(X)}{\partial X_{11}} & \nabla\dfrac{\partial g_{mn}(X)}{\partial X_{12}} & \cdots & \nabla\dfrac{\partial g_{mn}(X)}{\partial X_{1L}}\\[1ex]
\nabla\dfrac{\partial g_{mn}(X)}{\partial X_{21}} & \nabla\dfrac{\partial g_{mn}(X)}{\partial X_{22}} & \cdots & \nabla\dfrac{\partial g_{mn}(X)}{\partial X_{2L}}\\
\vdots & \vdots & & \vdots\\
\nabla\dfrac{\partial g_{mn}(X)}{\partial X_{K1}} & \nabla\dfrac{\partial g_{mn}(X)}{\partial X_{K2}} & \cdots & \nabla\dfrac{\partial g_{mn}(X)}{\partial X_{KL}}
\end{bmatrix} \in \mathbb{R}^{K\times L\times K\times L}
\]
\[
=
\begin{bmatrix}
\dfrac{\partial\nabla g_{mn}(X)}{\partial X_{11}} & \dfrac{\partial\nabla g_{mn}(X)}{\partial X_{12}} & \cdots & \dfrac{\partial\nabla g_{mn}(X)}{\partial X_{1L}}\\[1ex]
\dfrac{\partial\nabla g_{mn}(X)}{\partial X_{21}} & \dfrac{\partial\nabla g_{mn}(X)}{\partial X_{22}} & \cdots & \dfrac{\partial\nabla g_{mn}(X)}{\partial X_{2L}}\\
\vdots & \vdots & & \vdots\\
\dfrac{\partial\nabla g_{mn}(X)}{\partial X_{K1}} & \dfrac{\partial\nabla g_{mn}(X)}{\partial X_{K2}} & \cdots & \dfrac{\partial\nabla g_{mn}(X)}{\partial X_{KL}}
\end{bmatrix}
\tag{1591}
\]

Rotating our perspective, we get several views of the second-order gradient:

\[
\nabla^2 g(X) =
\begin{bmatrix}
\nabla^2 g_{11}(X) & \nabla^2 g_{12}(X) & \cdots & \nabla^2 g_{1N}(X)\\
\nabla^2 g_{21}(X) & \nabla^2 g_{22}(X) & \cdots & \nabla^2 g_{2N}(X)\\
\vdots & \vdots & & \vdots\\
\nabla^2 g_{M1}(X) & \nabla^2 g_{M2}(X) & \cdots & \nabla^2 g_{MN}(X)
\end{bmatrix} \in \mathbb{R}^{M\times N\times K\times L\times K\times L}
\tag{1592}
\]

\[
\nabla^2 g(X)^{\mathrm{T}_1} =
\begin{bmatrix}
\nabla\dfrac{\partial g(X)}{\partial X_{11}} & \nabla\dfrac{\partial g(X)}{\partial X_{12}} & \cdots & \nabla\dfrac{\partial g(X)}{\partial X_{1L}}\\[1ex]
\nabla\dfrac{\partial g(X)}{\partial X_{21}} & \nabla\dfrac{\partial g(X)}{\partial X_{22}} & \cdots & \nabla\dfrac{\partial g(X)}{\partial X_{2L}}\\
\vdots & \vdots & & \vdots\\
\nabla\dfrac{\partial g(X)}{\partial X_{K1}} & \nabla\dfrac{\partial g(X)}{\partial X_{K2}} & \cdots & \nabla\dfrac{\partial g(X)}{\partial X_{KL}}
\end{bmatrix} \in \mathbb{R}^{K\times L\times M\times N\times K\times L}
\tag{1593}
\]

\[
\nabla^2 g(X)^{\mathrm{T}_2} =
\begin{bmatrix}
\dfrac{\partial\nabla g(X)}{\partial X_{11}} & \dfrac{\partial\nabla g(X)}{\partial X_{12}} & \cdots & \dfrac{\partial\nabla g(X)}{\partial X_{1L}}\\[1ex]
\dfrac{\partial\nabla g(X)}{\partial X_{21}} & \dfrac{\partial\nabla g(X)}{\partial X_{22}} & \cdots & \dfrac{\partial\nabla g(X)}{\partial X_{2L}}\\
\vdots & \vdots & & \vdots\\
\dfrac{\partial\nabla g(X)}{\partial X_{K1}} & \dfrac{\partial\nabla g(X)}{\partial X_{K2}} & \cdots & \dfrac{\partial\nabla g(X)}{\partial X_{KL}}
\end{bmatrix} \in \mathbb{R}^{K\times L\times K\times L\times M\times N}
\tag{1594}
\]

Assuming the limits exist, we may state the partial derivative of the $mn^{\text{th}}$ entry of $g$ with respect to the $kl^{\text{th}}$ and $ij^{\text{th}}$ entries of $X$:

\[
\frac{\partial^2 g_{mn}(X)}{\partial X_{kl}\,\partial X_{ij}}
= \lim_{\Delta\tau,\Delta t\to 0}
\frac{g_{mn}(X+\Delta t\, e_k e_l^{\mathrm{T}}+\Delta\tau\, e_i e_j^{\mathrm{T}})
 - g_{mn}(X+\Delta t\, e_k e_l^{\mathrm{T}})
 - \bigl(g_{mn}(X+\Delta\tau\, e_i e_j^{\mathrm{T}}) - g_{mn}(X)\bigr)}{\Delta\tau\,\Delta t}
\tag{1595}
\]

Differentiating (1575) and then scaling by $Y_{ij}$:

\[
\frac{\partial^2 g_{mn}(X)}{\partial X_{kl}\,\partial X_{ij}}\, Y_{kl}\, Y_{ij}
= \lim_{\Delta t\to 0}
\frac{\dfrac{\partial g_{mn}(X+\Delta t\, Y_{kl}\, e_k e_l^{\mathrm{T}})}{\partial X_{ij}}
 - \dfrac{\partial g_{mn}(X)}{\partial X_{ij}}}{\Delta t}\, Y_{ij}
\tag{1596}
\]
\[
= \lim_{\Delta\tau,\Delta t\to 0}
\frac{g_{mn}(X+\Delta t\, Y_{kl}\, e_k e_l^{\mathrm{T}}+\Delta\tau\, Y_{ij}\, e_i e_j^{\mathrm{T}})
 - g_{mn}(X+\Delta t\, Y_{kl}\, e_k e_l^{\mathrm{T}})
 - \bigl(g_{mn}(X+\Delta\tau\, Y_{ij}\, e_i e_j^{\mathrm{T}}) - g_{mn}(X)\bigr)}{\Delta\tau\,\Delta t}
\]
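The four-point quotient in (1595) can be checked numerically. The sketch below (the test function $g(X)=\operatorname{tr}(X^{\mathrm{T}}AX)$ and all variable names are illustrative choices, not from the text) compares the finite-difference quotient against the analytic mixed second partial $(A+A^{\mathrm{T}})_{ik}\,\delta_{jl}$; because the test function is quadratic, the quotient matches the limit up to roundoff even at finite step sizes.

```python
import numpy as np

rng = np.random.default_rng(0)
K, L = 4, 3
A = rng.standard_normal((K, K))   # arbitrary (not necessarily symmetric) matrix
X = rng.standard_normal((K, L))   # point of evaluation

def g(X):
    # scalar test function g(X) = tr(X^T A X); its mixed second partial
    # d^2 g / (dX_kl dX_ij) equals (A + A^T)_{ik} * delta_{jl}
    return np.trace(X.T @ A @ X)

def mixed_partial_fd(g, X, k, l, i, j, dt=1e-3, dtau=1e-3):
    # four-point quotient from (1595):
    # [g(X + dt e_k e_l^T + dtau e_i e_j^T) - g(X + dt e_k e_l^T)
    #  - (g(X + dtau e_i e_j^T) - g(X))] / (dtau * dt)
    Ekl = np.zeros_like(X); Ekl[k, l] = dt
    Eij = np.zeros_like(X); Eij[i, j] = dtau
    return (g(X + Ekl + Eij) - g(X + Ekl) - (g(X + Eij) - g(X))) / (dtau * dt)

k, l, i, j = 2, 1, 0, 1
exact = (A + A.T)[i, k] * (1.0 if j == l else 0.0)
approx = mixed_partial_fd(g, X, k, l, i, j)
assert abs(approx - exact) < 1e-6
```

For a non-quadratic $g$ the quotient only converges to the mixed partial as the steps shrink, so a looser tolerance (balancing truncation against roundoff) would be needed.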
