

It follows easily that $\tilde{\mathcal{F}}$ is also a commuting family of diagonalizable matrices. By Lemma 13.1.7, every $B \in \tilde{\mathcal{F}}$ is of the form given in (13.2) because each of these commutes with the matrix $D$ described above as $S^{-1}AS$, and so by block multiplication the diagonal blocks $B_i$ corresponding to different $B \in \tilde{\mathcal{F}}$ commute.

By Corollary 13.1.4 each of these blocks is diagonalizable, because $B$ is known to be. Therefore, by induction, since all the blocks are no larger than $(n-1) \times (n-1)$ thanks to the assumption that $A$ has more than one eigenvalue, there exist invertible $n_i \times n_i$ matrices $T_i$ such that $T_i^{-1} B_i T_i$ is a diagonal matrix whenever $B_i$ is one of the matrices making up the block diagonal of any $B \in \tilde{\mathcal{F}}$. It follows that for $T$ defined by

$$T \equiv \begin{pmatrix} T_1 & 0 & \cdots & 0 \\ 0 & T_2 & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \cdots & 0 & T_r \end{pmatrix},$$

then $T^{-1} B T$ is a diagonal matrix for every $B \in \tilde{\mathcal{F}}$, including $D$. Consider $ST$. It follows that for all $C \in \mathcal{F}$,
$$T^{-1} \overbrace{S^{-1} C S}^{\text{something in } \tilde{\mathcal{F}}} \, T = (ST)^{-1}\, C \,(ST) = \text{a diagonal matrix.} \;\blacksquare$$
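To make the steps concrete, here is a minimal numerical sketch, not from the text, with all matrices chosen purely for illustration: a commuting pair in which $A$ has a repeated eigenvalue, so the blockwise argument above is actually needed. It diagonalizes $A$ by $S$, diagonalizes the diagonal blocks of $S^{-1}BS$ by the $T_i$, assembles the block diagonal $T$, and checks that $ST$ diagonalizes both matrices.

```python
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(3)

# Illustrative commuting pair in which A has the repeated eigenvalue 2,
# so eigenvectors of A alone need not diagonalize B.
P = rng.standard_normal((3, 3))           # generic, hence invertible
Pi = np.linalg.inv(P)
A = P @ np.diag([2.0, 2.0, 5.0]) @ Pi
B2 = np.array([[1.0, 1.0], [1.0, 2.0]])   # any diagonalizable 2x2 block
B = P @ block_diag(B2, 7.0) @ Pi
assert np.allclose(A @ B, B @ A)          # the pair commutes

# Step 1: S diagonalizes A, and S^{-1} B S is block diagonal (Lemma 13.1.7).
S, Si = P, Pi                             # columns grouped by eigenvalue of A
Btilde = Si @ B @ S                       # = block_diag(B2, 7)

# Step 2: diagonalize each diagonal block and assemble the block
# diagonal T = diag(T_1, ..., T_r) of the proof.
_, T1 = np.linalg.eig(Btilde[:2, :2])
T = block_diag(T1, np.eye(1))

# Conclusion: (S T)^{-1} C (S T) is diagonal for every C in the family.
ST, STi = S @ T, np.linalg.inv(S @ T)
for C in (A, B):
    M = STi @ C @ ST
    assert np.allclose(M - np.diag(np.diag(M)), 0.0, atol=1e-8)
```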

Theorem 13.1.9 Let $\mathcal{F}$ denote a family of matrices which are diagonalizable. Then $\mathcal{F}$ is simultaneously diagonalizable if and only if $\mathcal{F}$ is a commuting family.

Proof: If $\mathcal{F}$ is a commuting family, it follows from Lemma 13.1.8 that it is simultaneously diagonalizable. If it is simultaneously diagonalizable, then it follows from Lemma 13.1.6 that it is a commuting family. $\blacksquare$
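The hard direction is Lemma 13.1.8, sketched numerically above. The easy direction can also be checked directly: conjugates of diagonal matrices by one fixed invertible $P$ commute, since diagonal matrices do. A minimal sketch with illustrative matrices:

```python
import numpy as np

# Easy direction of Theorem 13.1.9: a simultaneously diagonalizable
# family commutes, because diagonal matrices commute and conjugation
# by a fixed P preserves products.
rng = np.random.default_rng(4)
P = rng.standard_normal((3, 3))
Pi = np.linalg.inv(P)
family = [P @ np.diag(rng.standard_normal(3)) @ Pi for _ in range(3)]
for X in family:
    for Y in family:
        assert np.allclose(X @ Y, Y @ X)
```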

13.2 Schur’s Theorem

Recall that a linear transformation $L \in \mathcal{L}(V,V)$, for $V$ a finite dimensional inner product space, can be represented in the form
$$L = \sum_{ij} l_{ij} \, v_i \otimes v_j$$
where $\{v_1, \cdots, v_n\}$ is an orthonormal basis. Of course different bases will yield different matrices $(l_{ij})$. Schur's theorem gives the existence of a basis in an inner product space such that $(l_{ij})$ is particularly simple.
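As a concrete illustration, not from the text and over $\mathbb{R}$ with arbitrary matrices, assuming the usual convention $(v_i \otimes v_j)\,u = \langle u, v_j \rangle v_i$: the coefficients are $l_{ij} = \langle L v_j, v_i \rangle$, and $v_i \otimes v_j$ acts as the outer product $v_i v_j^T$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
L_mat = rng.standard_normal((n, n))       # L in the standard basis

# An orthonormal basis {v_1, ..., v_n}: columns of a random orthogonal Q.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
v = [Q[:, i] for i in range(n)]

# l_ij = <L v_j, v_i>, and v_i ⊗ v_j acts as the outer product v_i v_j^T.
l = np.array([[v[i] @ (L_mat @ v[j]) for j in range(n)] for i in range(n)])
recon = sum(l[i, j] * np.outer(v[i], v[j]) for i in range(n) for j in range(n))
assert np.allclose(recon, L_mat)          # L = sum_ij l_ij v_i ⊗ v_j
```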

Definition 13.2.1 Let $L \in \mathcal{L}(V,V)$ where $V$ is a vector space. Then a subspace $U$ of $V$ is $L$ invariant if $L(U) \subseteq U$.

In what follows, $\mathbb{F}$ will be the field of scalars, usually $\mathbb{C}$ but maybe something else.

Theorem 13.2.2 Let $L \in \mathcal{L}(H,H)$ for $H$ a finite dimensional inner product space such that the restriction of $L^*$ to every $L$ invariant subspace has its eigenvalues in $\mathbb{F}$. Then there exist constants $c_{ij}$ for $i \le j$ and an orthonormal basis $\{w_i\}_{i=1}^{n}$ such that
$$L = \sum_{j=1}^{n} \sum_{i=1}^{j} c_{ij} \, w_i \otimes w_j.$$
The constants $c_{ii}$ are the eigenvalues of $L$.
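Over $\mathbb{C}$, where the hypothesis on eigenvalues holds automatically, the complex Schur factorization computed by scipy.linalg.schur realizes this theorem numerically: the columns of the unitary factor supply the orthonormal basis $\{w_i\}$, the upper triangular factor supplies the $c_{ij}$, and its diagonal carries the eigenvalues. A minimal sketch with an arbitrary test matrix:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))

# Complex Schur form: A = Z T Z^H with Z unitary and T upper triangular.
# The columns of Z play the role of the orthonormal basis {w_i}, and
# the entries of T play the role of the constants c_ij for i <= j.
T, Z = schur(A, output='complex')
assert np.allclose(Z @ T @ Z.conj().T, A)
assert np.allclose(Z.conj().T @ Z, np.eye(4))   # Z is unitary
assert np.allclose(np.tril(T, -1), 0)           # c_ij = 0 for i > j

# Each diagonal entry c_ii of T is an eigenvalue of A: det(A - c I) = 0.
for c in np.diag(T):
    assert abs(np.linalg.det(A - c * np.eye(4))) < 1e-8
```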
