Foundations of Data Science


Lemma 3.10 (Analog of eigenvalues and eigenvectors)

$$Av_i = \sigma_i u_i \quad \text{and} \quad A^T u_i = \sigma_i v_i.$$

Proof: The first equation is already known. For the second, note that the SVD gives $A^T = \sum_j \sigma_j v_j u_j^T$, so $A^T u_i = \sum_j \sigma_j v_j u_j^T u_i$, where, since the $u_j$ are orthonormal, all terms in the summation are zero except for $j = i$, leaving $A^T u_i = \sigma_i v_i$.
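As a quick sanity check on these two relations, here is a minimal NumPy sketch; the matrix `A` and all variable names are illustrative assumptions, not from the text:

```python
import numpy as np

# Illustrative matrix (an assumption for this sketch; any real matrix works).
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# Thin SVD: columns of U are the u_i, rows of Vt are the v_i^T.
U, sigma, Vt = np.linalg.svd(A, full_matrices=False)

for i in range(len(sigma)):
    u_i, v_i = U[:, i], Vt[i, :]
    # Lemma 3.10:  A v_i = sigma_i u_i   and   A^T u_i = sigma_i v_i.
    assert np.allclose(A @ v_i, sigma[i] * u_i)
    assert np.allclose(A.T @ u_i, sigma[i] * v_i)
```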

3.7 Power Method for Computing the Singular Value Decomposition

Computing the singular value decomposition is an important branch of numerical analysis in which there have been many sophisticated developments over a long period of time. The reader is referred to numerical analysis texts for more details. Here we present an “in-principle” method to establish that the approximate SVD of a matrix $A$ can be computed in polynomial time. The method we present, called the power method, is simple and is in fact the conceptual starting point for many algorithms. Let $A$ be a matrix whose SVD is $\sum_i \sigma_i u_i v_i^T$. We wish to work with a matrix that is square and symmetric. Let $B = A^T A$. By direct multiplication, using the orthogonality of the $u_i$'s that was proved in Theorem 3.7,

$$B = A^T A = \left( \sum_i \sigma_i v_i u_i^T \right) \left( \sum_j \sigma_j u_j v_j^T \right) = \sum_{i,j} \sigma_i \sigma_j v_i (u_i^T u_j) v_j^T = \sum_i \sigma_i^2 v_i v_i^T.$$

The matrix $B$ is square and symmetric, and has the same left and right-singular vectors. In particular, $Bv_j = \left( \sum_i \sigma_i^2 v_i v_i^T \right) v_j = \sigma_j^2 v_j$, so $v_j$ is an eigenvector of $B$ with eigenvalue $\sigma_j^2$. If $A$ is itself square and symmetric, it will have the same right and left-singular vectors, namely $A = \sum_i \sigma_i v_i v_i^T$, and computing $B$ is unnecessary.
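A short check of this eigenvector property, in the same hedged style as above (the matrix is again a made-up example):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])          # same illustrative matrix as above
_, sigma, Vt = np.linalg.svd(A, full_matrices=False)

B = A.T @ A                         # square and symmetric
for j in range(len(sigma)):
    v_j = Vt[j, :]
    # B v_j = sigma_j^2 v_j, so v_j is an eigenvector with eigenvalue sigma_j^2.
    assert np.allclose(B @ v_j, sigma[j] ** 2 * v_j)
```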

Now consider computing $B^2$:

$$B^2 = \left( \sum_i \sigma_i^2 v_i v_i^T \right) \left( \sum_j \sigma_j^2 v_j v_j^T \right) = \sum_{i,j} \sigma_i^2 \sigma_j^2 v_i (v_i^T v_j) v_j^T.$$

When $i \neq j$, the dot product $v_i^T v_j$ is zero by orthogonality.⁸ Thus, $B^2 = \sum_{i=1}^{r} \sigma_i^4 v_i v_i^T$. In computing the $k$th power of $B$, all the cross product terms are zero and

$$B^k = \sum_{i=1}^{r} \sigma_i^{2k} v_i v_i^T.$$

⁸The “outer product” $v_i v_j^T$ is a matrix and is not zero even for $i \neq j$.
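To see the point of the $B^k$ formula in code: since $B^k = \sum_i \sigma_i^{2k} v_i v_i^T$, the term with the largest $\sigma_i$ dominates as $k$ grows, so repeatedly multiplying by $B$ and normalizing isolates the top right-singular vector. The sketch below is an in-principle illustration in NumPy, not the book's pseudocode; the matrix, function name, and iteration count are assumptions:

```python
import numpy as np

def power_method_top_singular(A, num_iters=100, seed=0):
    """In-principle power method: approximate the top singular value
    sigma_1 and right-singular vector v_1 of A via B = A^T A."""
    rng = np.random.default_rng(seed)
    B = A.T @ A
    x = rng.standard_normal(A.shape[1])   # random start; generically has a
                                          # nonzero component along v_1
    for _ in range(num_iters):
        x = B @ x
        x /= np.linalg.norm(x)            # normalize so B^k x doesn't blow up
    v1 = x
    sigma1 = np.linalg.norm(A @ v1)       # |A v_1| = sigma_1
    return sigma1, v1

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])
sigma1, v1 = power_method_top_singular(A)
print(sigma1, np.linalg.svd(A, compute_uv=False)[0])  # should agree
```

Normalizing at each step only rescales the iterate, so it does not change the direction that $B^k x$ converges to; convergence is fast when $\sigma_1$ is well separated from $\sigma_2$.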

