Linear Algebra, Theory And Applications, 2012a

130 SOME FACTORIZATIONS

It is customary to write this more simply as
$$A = PLU$$
where L is a lower triangular matrix having all ones on the diagonal and P is a permutation matrix consisting of $P_1 \cdots P_{n-1}$ as described above. This proves the following theorem.
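The factorization just described can be checked numerically; the following is a sketch using `scipy.linalg.lu` (assuming SciPy is available), which returns exactly these three factors. The matrix chosen here has a zero in the upper left corner, so a genuine permutation is required.

```python
import numpy as np
from scipy.linalg import lu  # assumes SciPy is installed

# A concrete invertible matrix (chosen arbitrarily for illustration);
# the zero pivot in position (1,1) forces a row interchange.
A = np.array([[0., 1., 2.],
              [1., 1., 1.],
              [2., 3., 0.]])

# scipy.linalg.lu returns P, L, U with A = P @ L @ U:
# P a permutation matrix, L lower triangular with ones on the
# main diagonal, U upper triangular.
P, L, U = lu(A)

assert np.allclose(P @ L @ U, A)        # A = PLU
assert np.allclose(np.diag(L), 1.0)     # ones on the main diagonal of L
assert np.allclose(L, np.tril(L))       # L is lower triangular
assert np.allclose(U, np.triu(U))       # U is upper triangular
```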

Theorem 5.6.1 Let A be any invertible n × n matrix. Then there exists a permutation matrix P and a lower triangular matrix L having all ones on the main diagonal and an upper triangular matrix U such that
$$A = PLU$$

5.7 The QR Factorization

As pointed out above, the LU factorization is not a mathematically respectable thing because it does not always exist. There is another factorization which does always exist. Much more can be said about it than I will say here. At this time, I will only deal with real matrices and so the inner product will be the usual real dot product.

Definition 5.7.1 An n × n real matrix Q is called an orthogonal matrix if
$$QQ^T = Q^TQ = I.$$
Thus an orthogonal matrix is one whose inverse is equal to its transpose.

First note that if a matrix is orthogonal this says
$$\sum_j Q_{ij}^T Q_{jk} = \sum_j Q_{ji} Q_{jk} = \delta_{ik}$$
Thus
$$|Qx|^2 = \sum_i \Bigg( \sum_j Q_{ij} x_j \Bigg)^2 = \sum_i \sum_r \sum_s Q_{is} x_s Q_{ir} x_r$$
$$= \sum_i \sum_r \sum_s Q_{is} Q_{ir} x_s x_r = \sum_r \sum_s \sum_i Q_{is} Q_{ir} x_s x_r$$
$$= \sum_r \sum_s \delta_{sr} x_s x_r = \sum_r x_r^2 = |x|^2$$

This shows that orthogonal transformations preserve distances. You can show that if you have a matrix which does preserve distances, then it must be orthogonal also.
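The norm-preserving property is easy to verify numerically. Here is a minimal sketch using a 2 × 2 rotation matrix, a standard example of an orthogonal matrix (the angle and test vector are arbitrary choices for illustration):

```python
import numpy as np

# A 2x2 rotation matrix is orthogonal for any angle theta
theta = 0.7  # arbitrary angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Inverse equals transpose: Q Q^T = Q^T Q = I
assert np.allclose(Q @ Q.T, np.eye(2))
assert np.allclose(Q.T @ Q, np.eye(2))

# Orthogonal transformations preserve distances: |Qx| = |x|
rng = np.random.default_rng(0)
x = rng.standard_normal(2)
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
```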

Example 5.7.2 One of the most important examples of an orthogonal matrix is the so-called Householder matrix. You have v a unit vector and you form the matrix
$$I - 2vv^T$$
This is an orthogonal matrix which is also symmetric. To see this, you use the rules of matrix operations.

$$\left(I - 2vv^T\right)^T = I^T - \left(2vv^T\right)^T = I - 2vv^T$$
so the matrix is symmetric. Since $v$ is a unit vector, $v^Tv = 1$, and so
$$\left(I - 2vv^T\right)\left(I - 2vv^T\right) = I - 4vv^T + 4vv^Tvv^T = I$$
which shows the matrix is orthogonal.
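A Householder matrix can be built and checked in a few lines; this sketch picks an arbitrary unit vector v in R^4 and verifies symmetry, orthogonality, and the defining reflection property (I − 2vv^T)v = −v:

```python
import numpy as np

# Build a Householder matrix H = I - 2 v v^T from a unit vector v
rng = np.random.default_rng(1)
v = rng.standard_normal(4)
v /= np.linalg.norm(v)               # normalize so v is a unit vector
H = np.eye(4) - 2.0 * np.outer(v, v)

assert np.allclose(H, H.T)              # symmetric
assert np.allclose(H @ H.T, np.eye(4))  # orthogonal: H H^T = I
assert np.allclose(H @ v, -v)           # H reflects v to -v
```

The last assertion follows directly from the algebra: Hv = v − 2v(v^T v) = v − 2v = −v.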
