Foundations of Data Science


Exercise 3.18 Modify the power method to find the first four singular vectors of a matrix A as follows. Randomly select four vectors and find an orthonormal basis for the space spanned by the four vectors. Then multiply each of the basis vectors times A and find a new orthonormal basis for the space spanned by the resulting four vectors. Apply your method to find the first four singular vectors of the matrix A of Exercise 3.17. In Matlab the command orth finds an orthonormal basis for the space spanned by a set of vectors.
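A sketch of this iteration in Python, with numpy's QR factorization standing in for Matlab's orth. Two assumptions here: the iteration multiplies by AᵀA (as the power method for singular vectors usually does) rather than by A alone, and the test matrix is a synthetic stand-in with well-separated singular values, since the matrix of Exercise 3.17 is not reproduced in this excerpt.

```python
import numpy as np

def top_singular_vectors(A, k=4, iters=100, seed=0):
    """Block power method: start from a random orthonormal basis and
    repeatedly multiply by A^T A, re-orthonormalizing each round
    (QR plays the role of Matlab's orth)."""
    rng = np.random.default_rng(seed)
    V, _ = np.linalg.qr(rng.standard_normal((A.shape[1], k)))
    for _ in range(iters):
        V, _ = np.linalg.qr(A.T @ (A @ V))
    return V  # columns approximate the top k right singular vectors

# Synthetic test matrix with known, well-separated singular values.
rng = np.random.default_rng(1)
U0, _ = np.linalg.qr(rng.standard_normal((20, 10)))
W0, _ = np.linalg.qr(rng.standard_normal((10, 10)))
A = U0 @ np.diag([10.0, 8, 6, 4, 2, 1, 0.5, 0.4, 0.3, 0.2]) @ W0.T

V = top_singular_vectors(A)
_, _, Vt = np.linalg.svd(A)
# The two 4-dimensional subspaces agree: all principal-angle cosines are ~1.
overlap = np.linalg.svd(V.T @ Vt[:4].T, compute_uv=False)
assert np.allclose(overlap, 1.0, atol=1e-8)
```

Multiplying by AᵀA keeps the iterates in the row space of A, which is why the columns converge to right singular vectors.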

Exercise 3.19 A matrix M is positive semi-definite if x^T Mx ≥ 0 for all x.

1. Let A be a real-valued matrix. Prove that B = AA^T is positive semi-definite.

2. Let A be the adjacency matrix of a graph. The Laplacian of A is L = D − A, where D is a diagonal matrix whose diagonal entries are the row sums of A. Prove that L is positive semi-definite by showing that L = B^T B, where B is an m-by-n matrix with a row for each edge in the graph, a column for each vertex, and we define

b_{ei} = \begin{cases} -1 & \text{if } i \text{ is the endpoint of } e \text{ with lesser index} \\ 1 & \text{if } i \text{ is the endpoint of } e \text{ with greater index} \\ 0 & \text{if } i \text{ is not an endpoint of } e \end{cases}
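The factorization in part 2 is easy to check numerically; a minimal sketch (the 4-cycle graph is an arbitrary example, not one from the text):

```python
import numpy as np

def incidence_matrix(edges, n):
    """Build the m-by-n edge-vertex incidence matrix B of part 2,
    so that L = B^T B."""
    B = np.zeros((len(edges), n))
    for e, (i, j) in enumerate(edges):
        lo, hi = min(i, j), max(i, j)
        B[e, lo] = -1.0  # endpoint with lesser index
        B[e, hi] = 1.0   # endpoint with greater index
    return B

# A 4-cycle on vertices 0..3.
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
D = np.diag(A.sum(axis=1))     # diagonal of row sums
L = D - A
B = incidence_matrix(edges, n)
assert np.allclose(B.T @ B, L)                   # L = B^T B
assert np.all(np.linalg.eigvalsh(L) >= -1e-12)   # hence L is PSD
```

Each edge's row contributes its endpoints' degrees to the diagonal and a −1 to the corresponding off-diagonal entries, which is exactly D − A.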

Exercise 3.20 Prove that the eigenvalues of a symmetric real-valued matrix are real.

Exercise 3.21 Suppose A is a square invertible matrix and the SVD of A is A = ∑_i σ_i u_i v_i^T. Prove that the inverse of A is ∑_i (1/σ_i) v_i u_i^T.
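The claimed formula for the inverse in Exercise 3.21 can be sanity-checked numerically; a minimal sketch on a random 5×5 matrix (invertible with probability 1):

```python
import numpy as np

# If A = sum_i s_i u_i v_i^T, then A^{-1} should be
# sum_i (1/s_i) v_i u_i^T.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
U, s, Vt = np.linalg.svd(A)
# Vt[i] is the row vector v_i^T, U[:, i] is the column u_i.
A_inv = sum((1.0 / s[i]) * np.outer(Vt[i], U[:, i]) for i in range(5))
assert np.allclose(A_inv, np.linalg.inv(A))
```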

Exercise 3.22 Suppose A is square, but not necessarily invertible, and has SVD A = ∑_{i=1}^r σ_i u_i v_i^T. Let B = ∑_{i=1}^r (1/σ_i) v_i u_i^T. Show that BAx = x for all x in the span of the right singular vectors of A. For this reason B is sometimes called the pseudo inverse of A and can play the role of A^{-1} in many applications.
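The pseudo inverse claim of Exercise 3.22 can likewise be checked on a deliberately rank-deficient square matrix (the rank-3 construction below is an arbitrary example):

```python
import numpy as np

# Build a rank-3 square matrix and verify B A x = x for x in the
# span of the first r right singular vectors.
rng = np.random.default_rng(0)
r, n = 3, 5
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank 3
U, s, Vt = np.linalg.svd(A)
B = sum((1.0 / s[i]) * np.outer(Vt[i], U[:, i]) for i in range(r))

x = Vt[:r].T @ rng.standard_normal(r)  # x in span(v_1, ..., v_r)
assert np.allclose(B @ A @ x, x)
```

Outside that span BAx = 0, which is why B only plays the role of A^{-1} on the row space of A.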

Exercise 3.23

1. For any matrix A, show that σ_k ≤ ||A||_F / √k.

2. Prove that there exists a matrix B of rank at most k such that ||A − B||_2 ≤ ||A||_F / √k.

3. Can the 2-norm on the left hand side in part 2 be replaced by the Frobenius norm?
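Part 1 follows from ||A||_F² = ∑_i σ_i² ≥ k σ_k², since the first k singular values are each at least σ_k. A quick numerical check of the inequality:

```python
import numpy as np

# Verify sigma_k <= ||A||_F / sqrt(k) for every k on a random matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 6))
s = np.linalg.svd(A, compute_uv=False)   # singular values, descending
fro = np.linalg.norm(A, "fro")
for k in range(1, len(s) + 1):
    assert s[k - 1] <= fro / np.sqrt(k) + 1e-12
```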

Exercise 3.24 Suppose an n × d matrix A is given and you are allowed to preprocess A. Then you are given a number of d-dimensional vectors x_1, x_2, . . . , x_m and for each of these vectors you must find the vector Ax_j approximately, in the sense that you must find a vector y_j satisfying |y_j − Ax_j| ≤ ε||A||_F |x_j|. Here ε > 0 is a given error bound. Describe an algorithm that accomplishes this in time O((d + n)/ε²) per x_j, not counting the preprocessing time. Hint: use Exercise 3.23.
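One possible approach (an assumption on my part, following the hint): precompute a rank-k truncated SVD with k = ⌈1/ε²⌉, so that ||A − A_k||_2 ≤ ||A||_F / √k ≤ ε||A||_F by Exercise 3.23, then answer each query with y_j = A_k x_j, which costs O((d + n)k) = O((d + n)/ε²) via two thin matrix-vector products. A sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, eps = 40, 30, 0.5
A = rng.standard_normal((n, d))

# Preprocessing (done once): rank-k truncated SVD, k = ceil(1/eps^2).
k = int(np.ceil(1.0 / eps**2))
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Uk, sk, Vkt = U[:, :k], s[:k], Vt[:k]

# Per-query work: y = U_k (s_k * (V_k^T x)), O((d + n) k) time.
x = rng.standard_normal(d)
y = Uk @ (sk * (Vkt @ x))
assert np.linalg.norm(y - A @ x) <= eps * np.linalg.norm(A, "fro") * np.linalg.norm(x)
```

The error bound holds because |y − Ax| = |(A_k − A)x| ≤ ||A − A_k||_2 |x|.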
