
15.2. THE QR ALGORITHM

where $R_k^{-1}$ is upper triangular. Also recall that from the definition of $S$ in (15.15),
$$S^{-1}AS = D$$
and so the columns of $S$ are eigenvectors of $A$, the $i^{th}$ being an eigenvector for $\lambda_i$. Now from the form of $L_k$, it follows $L_k R_k^{-1}$ is a block upper triangular matrix denoted by $T_B$ and so $Q_k = ST_B$. It follows from the above construction in (15.16) and the given assumption on the sizes of the eigenvalues that there are finitely many $2 \times 2$ blocks centered on the main diagonal along with possibly some diagonal entries. Therefore, for large $k$ the matrix
$$A_k = Q^{(k)T} A Q^{(k)}$$
is approximately of the same form as that of
$$Q_k^{\ast} A Q_k = T_B^{-1} S^{-1} A S T_B = T_B^{-1} D T_B,$$
which is a block upper triangular matrix. As explained above, multiplication by the various diagonal unitary matrices does not affect this form. Therefore, for large $k$, $A_k$ is approximately a block upper triangular matrix.
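As a numerical illustration (not from the text), the basic unshifted QR iteration discussed above can be sketched in NumPy; the test matrix and the number of iterations are arbitrary choices:

```python
import numpy as np

# A minimal sketch of the unshifted QR iteration: factor A_k = Q_k R_k,
# then set A_{k+1} = R_k Q_k. Each A_k is unitarily similar to A, and
# under the eigenvalue assumptions above, A_k approaches (block) upper
# triangular form for large k.
def qr_iteration(A, steps=200):
    Ak = np.array(A, dtype=float)
    for _ in range(steps):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q
    return Ak

# Arbitrary symmetric example with distinct eigenvalue magnitudes,
# so the limit is actually diagonal rather than block triangular.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 1.0]])
Ak = qr_iteration(A)

# The strictly lower triangular part becomes negligible, and the
# diagonal entries approximate the eigenvalues of A.
print(np.tril(Ak, -1))
print(np.sort(np.diag(Ak)))
print(np.sort(np.linalg.eigvalsh(A)))
```

Each step is a similarity transformation, so the spectrum is preserved while the off-diagonal part decays at a rate governed by the ratios of consecutive eigenvalue magnitudes.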

How would this change if the above assumption on the size of the eigenvalues were relaxed, but the matrix were still nondefective with appropriate matrices having an LU factorization as above? It would mean the blocks on the diagonal would be larger, which immediately makes the problem more cumbersome to deal with. However, in the case that the eigenvalues of $A$ are distinct, the above situation really is typical of what occurs, and in any case the problem can be quickly reduced to this case.

To see this, suppose condition (15.19) is violated and $\lambda_j, \cdots, \lambda_{j+p}$ are complex eigenvalues having nonzero imaginary parts such that each has the same absolute value but they are all distinct. Then let $\mu > 0$ and consider the matrix $A + \mu I$. Thus the corresponding eigenvalues of $A + \mu I$ are $\lambda_j + \mu, \cdots, \lambda_{j+p} + \mu$. A short computation shows $|\lambda_j + \mu|, \cdots, |\lambda_{j+p} + \mu|$ are all distinct, and so the above situation of (15.19) is obtained. Of course, if there are repeated eigenvalues, it may not be possible to reduce to the case above, and you would end up with large blocks on the main diagonal which could be difficult to deal with.
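A quick numeric check of this shifting device (the eigenvalues below are an arbitrary example, not from the text): distinct complex numbers of equal absolute value acquire distinct absolute values once a positive real shift is added, provided their real parts are distinct.

```python
import numpy as np

# Arbitrary distinct complex eigenvalues with the same absolute value
# (modulus 5) and nonzero imaginary parts, so (15.19) is violated.
lams = np.array([3 + 4j, 4 + 3j, 5j])
assert np.allclose(np.abs(lams), 5.0)   # equal moduli before shifting

# Shift by mu > 0: the eigenvalues of A + mu*I are lam + mu. Since
# |lam + mu|^2 = |lam|^2 + 2*mu*Re(lam) + mu^2, the shifted moduli are
# distinct whenever the real parts Re(lam) are distinct, as here.
mu = 1.0
shifted = np.abs(lams + mu)
print(shifted)
```

Note that a real shift leaves the moduli of a conjugate pair equal, since $\overline{\lambda} + \mu = \overline{\lambda + \mu}$; such pairs are exactly what produce the $2 \times 2$ blocks above.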

So how do you identify the eigenvalues? You know $A_k$, and behold that it is close to a block upper triangular matrix $T_B'$. You know $A_k$ is also similar to $A$. Therefore, $T_B'$ has eigenvalues which are close to the eigenvalues of $A_k$, and hence to those of $A$, provided $k$ is sufficiently large. See Theorem 7.9.2, which depends on complex analysis, or the exercise on Page 197, which gives another way to see this. Thus you find the eigenvalues of this block triangular matrix $T_B'$ and assert that these are good approximations of the eigenvalues of $A_k$ and hence of those of $A$. How do you find the eigenvalues of a block triangular matrix? This is easy from Lemma 7.4.5. Say

$$T_B' = \begin{pmatrix} B_1 & \cdots & \ast \\ & \ddots & \vdots \\ 0 & & B_m \end{pmatrix}$$

Then forming $\lambda I - T_B'$ and taking the determinant, it follows from Lemma 7.4.5 that this equals
$$\prod_{j=1}^{m} \det\left(\lambda I_j - B_j\right)$$
and so all you have to do is take the union of the eigenvalues for each $B_j$. In the case emphasized here this is very easy because these blocks are just $2 \times 2$ matrices.
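The determinant identity above can be checked numerically. The matrix below is an arbitrary example with two $2 \times 2$ diagonal blocks, one of which carries a complex conjugate pair of eigenvalues:

```python
import numpy as np

# Block upper triangular matrix with 2 x 2 diagonal blocks B_1, B_2.
# The entries above the diagonal blocks (the * part) do not affect
# the eigenvalues, by the determinant identity above.
B1 = np.array([[0.0, -1.0],
               [1.0,  0.0]])        # eigenvalues +i and -i
B2 = np.array([[3.0, 1.0],
               [0.0, 2.0]])         # eigenvalues 3 and 2
star = np.array([[7.0, 8.0],
                 [9.0, 1.0]])       # the * block, chosen arbitrarily
T = np.block([[B1, star],
              [np.zeros((2, 2)), B2]])

# The spectrum of T is the union of the spectra of the diagonal blocks.
eigs_T = np.sort_complex(np.linalg.eigvals(T))
eigs_blocks = np.sort_complex(np.concatenate(
    [np.linalg.eigvals(B1), np.linalg.eigvals(B2)]))
print(eigs_T)
print(eigs_blocks)
```

Changing the `star` block changes $T$ but not its eigenvalues, exactly as the product formula predicts.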
