Linear Algebra, Theory And Applications, 2012a

INNER PRODUCT SPACES

This is called a minimizing sequence. Show there exists a unique $k \in K$ such that

$$\lim_{n \to \infty} |k_n - k| = 0$$

and that $k = Px$. That is, there exists a well defined projection map onto the closed convex subset of $H$. Hint: Use the parallelogram identity in an auspicious manner to show $\{k_n\}$ is a Cauchy sequence, which must therefore converge. Since $K$ is closed, the limit lies in $K$ and is the desired vector.
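For the hint, one auspicious arrangement (a sketch of the key estimate, writing $d = \inf_{k \in K} |x - k|$ for the distance being minimized) applies the parallelogram identity with $a = k_n - x$ and $b = k_m - x$:

```latex
% Parallelogram identity |a + b|^2 + |a - b|^2 = 2|a|^2 + 2|b|^2
% with a = k_n - x and b = k_m - x; note (k_n + k_m)/2 lies in K
% by convexity, so |(k_n + k_m)/2 - x| >= d.
\begin{aligned}
|k_n - k_m|^2 &= 2|k_n - x|^2 + 2|k_m - x|^2 - 4\Bigl|\tfrac{k_n + k_m}{2} - x\Bigr|^2 \\
              &\le 2|k_n - x|^2 + 2|k_m - x|^2 - 4d^2 \longrightarrow 2d^2 + 2d^2 - 4d^2 = 0.
\end{aligned}
```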

19. Let $H$ be an inner product space which is also complete and let $P$ denote the projection map onto a closed convex subset $K$. Show this projection map is characterized by the inequality

$$\operatorname{Re}\,(k - Px,\, x - Px) \le 0 \quad \text{for all } k \in K.$$

That is, a point $z \in K$ equals $Px$ if and only if $\operatorname{Re}\,(k - z,\, x - z) \le 0$ for all $k \in K$. The inequality is called a variational inequality because $k$ is allowed to vary and the inequality continues to hold for all $k \in K$.
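The characterization can be sanity checked numerically (a sketch, not a substitute for the proof). The code below assumes $K$ is the closed unit ball in $\mathbb{R}^4$, where the projection has the explicit form $Px = x / \max(1, |x|)$, and samples points $k \in K$:

```python
import numpy as np

rng = np.random.default_rng(0)

def proj_ball(v):
    # Projection onto the closed unit ball: scale down iff |v| > 1.
    return v / max(1.0, np.linalg.norm(v))

x = rng.normal(size=4) * 3.0          # a generic point, typically outside K
Px = proj_ball(x)
for _ in range(1000):
    k = rng.normal(size=4)
    k /= max(1.0, np.linalg.norm(k))  # force k into K
    # Variational inequality Re(k - Px, x - Px) <= 0 (real case: dot product)
    assert np.dot(k - Px, x - Px) <= 1e-12
print("variational inequality held for all sampled k")
```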

20. Using Problem 19 and Problems 17 - 18, show the projection map $P$ onto a closed convex subset is Lipschitz continuous with Lipschitz constant 1. That is,

$$|Px - Py| \le |x - y|.$$
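Again using the unit ball projection as a stand-in for a general closed convex set (an illustration, not the requested proof), a minimal check of the nonexpansiveness claim:

```python
import numpy as np

rng = np.random.default_rng(1)

def proj_ball(v):
    # Projection onto the closed unit ball in R^n.
    return v / max(1.0, np.linalg.norm(v))

for _ in range(1000):
    x = rng.normal(size=4) * 2.0
    y = rng.normal(size=4) * 2.0
    lhs = np.linalg.norm(proj_ball(x) - proj_ball(y))
    assert lhs <= np.linalg.norm(x - y) + 1e-12  # |Px - Py| <= |x - y|
print("|Px - Py| <= |x - y| held for all sampled pairs")
```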

21. Give an example of two vectors $x, y \in \mathbb{R}^4$ and a subspace $V$ such that $x \cdot y = 0$ but $Px \cdot Py \neq 0$, where $P$ denotes the projection map which sends $x$ to its closest point on $V$.
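One concrete candidate (an illustration, not the only possible answer): take $x = e_1$, $y = e_2$, and $V = \operatorname{span}\{(1,1,0,0)\}$; then $x \cdot y = 0$ while $Px = Py = \frac{1}{2}(1,1,0,0)$. A quick check:

```python
import numpy as np

x = np.array([1.0, 0.0, 0.0, 0.0])
y = np.array([0.0, 1.0, 0.0, 0.0])
v = np.array([1.0, 1.0, 0.0, 0.0])   # V = span{v}

def proj_V(u):
    # Orthogonal projection onto the line spanned by v.
    return (u @ v) / (v @ v) * v

print(x @ y)                  # 0.0  -- x and y are orthogonal
print(proj_V(x) @ proj_V(y))  # 0.5  -- their projections are not
```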

22. Suppose you are given the data $(1, 2), (2, 4), (3, 8), (0, 0)$. Find the linear regression line using the formulas derived above. Then graph the given data along with your regression line.
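For reference, the least squares formulas give slope $52/20 = 2.6$ and intercept $-0.4$. A minimal NumPy check (np.linalg.lstsq solves the same normal equations):

```python
import numpy as np

xs = np.array([1.0, 2.0, 3.0, 0.0])
ys = np.array([2.0, 4.0, 8.0, 0.0])

# Design matrix for y = m*x + b: one column of x values, one of ones.
A = np.column_stack([xs, np.ones_like(xs)])
(m, b), *_ = np.linalg.lstsq(A, ys, rcond=None)
print(m, b)  # 2.6, -0.4  ->  regression line y = 2.6 x - 0.4
```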

23. Generalize the least squares procedure to the situation in which data is given and you desire to fit it with an expression of the form $y = af(x) + bg(x) + c$, where the problem would be to find $a$, $b$ and $c$ in order to minimize the error. Could this be generalized to higher dimensions? How about more functions?
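The key observation for the generalization is that the model is linear in the unknowns $a, b, c$ even when $f$ and $g$ are not linear in $x$, so the design matrix simply gains one column per function. A sketch ($\sin$ and $x^2$ are arbitrary choices for illustration):

```python
import numpy as np

def fit(xs, ys, f, g):
    # Columns f(x), g(x), 1 make y = a f(x) + b g(x) + c linear
    # in (a, b, c); lstsq then solves the normal equations.
    A = np.column_stack([f(xs), g(xs), np.ones_like(xs)])
    coeffs, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coeffs  # a, b, c

xs = np.linspace(0.0, 3.0, 20)
ys = 2.0 * np.sin(xs) + 0.5 * xs**2 + 1.0   # exactly representable data
a, b, c = fit(xs, ys, np.sin, np.square)
print(a, b, c)  # approximately 2.0, 0.5, 1.0
```

Higher dimensions and more functions only add columns to the design matrix; the minimization remains a linear problem so long as the coefficients enter linearly.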

24. Let $A \in \mathcal{L}(X, Y)$ where $X$ and $Y$ are finite dimensional vector spaces with the dimension of $X$ equal to $n$. Define $\operatorname{rank}(A) \equiv \dim(A(X))$ and $\operatorname{nullity}(A) \equiv \dim(\ker(A))$. Show that

$$\operatorname{nullity}(A) + \operatorname{rank}(A) = \dim(X).$$

Hint: Let $\{x_i\}_{i=1}^{r}$ be a basis for $\ker(A)$ and let $\{x_i\}_{i=1}^{r} \cup \{y_i\}_{i=1}^{n-r}$ be a basis for $X$. Then show that $\{Ay_i\}_{i=1}^{n-r}$ is linearly independent and spans $A(X)$.
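A numerical illustration of the identity (not the requested proof), assuming SciPy is available for the kernel computation:

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(2)
# A rank-deficient 5x7 matrix: a product of 5x3 and 3x7 factors
# has rank at most 3 (and generically exactly 3).
A = rng.normal(size=(5, 3)) @ rng.normal(size=(3, 7))

rank = np.linalg.matrix_rank(A)
nullity = null_space(A).shape[1]
print(rank, nullity, rank + nullity)  # 3, 4, 7 = dim(X)
```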

25. Let $A$ be an $m \times n$ matrix. Show the column rank of $A$ equals the column rank of $A^*A$. Next verify the column rank of $A^*A$ is no larger than the column rank of $A^*$. Next justify the following chain to conclude the column rank of $A$ equals the column rank of $A^*$:

$$\operatorname{rank}(A) = \operatorname{rank}(A^*A) \le \operatorname{rank}(A^*) = \operatorname{rank}(AA^*) \le \operatorname{rank}(A).$$

Hint: Start with an orthonormal basis $\{Ax_j\}_{j=1}^{r}$ of $A(\mathbb{F}^n)$ and verify $\{A^*Ax_j\}_{j=1}^{r}$ is a basis for $A^*A(\mathbb{F}^n)$.
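A numerical spot check of the chain over $\mathbb{R}$, where $A^* = A^T$ (an illustration only; the rank-3 test matrix is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(3)
# 6x4 matrix whose last column is zeroed out, so rank(A) = 3.
A = rng.normal(size=(6, 4)) @ np.diag([1.0, 1.0, 1.0, 0.0])

r = np.linalg.matrix_rank
print(r(A), r(A.T @ A), r(A.T), r(A @ A.T))  # all equal 3
```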

26. Let $A$ be a real $m \times n$ matrix and let $A = QR$ be the QR factorization with $Q$ orthogonal and $R$ upper triangular. Show that there exists a solution $x$ to the equation

$$R^TRx = R^TQ^Tb$$

and that this solution is also a least squares solution as defined above, so that $A^TAx = A^Tb$.
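Since $Q^TQ = I$, the stated equation is exactly the normal equations: $A^TA = R^TQ^TQR = R^TR$ and $A^Tb = R^TQ^Tb$. A sketch of the computation, assuming $A$ has full column rank so that $R$ is invertible:

```python
import numpy as np
from scipy.linalg import solve_triangular

rng = np.random.default_rng(4)
A = rng.normal(size=(8, 3))
b = rng.normal(size=8)

Q, R = np.linalg.qr(A)            # reduced QR: Q is 8x3, R is 3x3
x = solve_triangular(R, Q.T @ b)  # solve R x = Q^T b by back substitution

# x satisfies the normal equations A^T A x = A^T b:
print(np.allclose(A.T @ A @ x, A.T @ b))  # True
```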
