
We will say that the columns form an orthonormal set of vectors, and similarly for the rows. Thus a matrix is orthogonal if its rows (or columns) form an orthonormal set of vectors. Notice that the convention is to call such a matrix orthogonal rather than orthonormal (although this may make more sense!).
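As an illustrative numerical sketch (not part of the text, and assuming NumPy is available), the condition that the rows of a matrix form an orthonormal set can be checked by computing the pairwise dot products of the rows, which must give the identity matrix; the 2 × 2 rotation matrix used below is just an example choice.

```python
import numpy as np

# Example: a 2x2 rotation matrix, whose rows form an orthonormal set.
theta = 0.3
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Matrix of pairwise dot products of the rows: should equal the identity.
gram = U @ U.T
print(np.allclose(gram, np.eye(2)))  # True, so U is orthogonal
```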

Proposition 4.132: Orthonormal Basis

The rows of an n × n orthogonal matrix form an orthonormal basis of R^n. Further, any orthonormal basis of R^n can be used to construct an n × n orthogonal matrix.

Proof. Recall from Theorem 4.126 that an orthonormal set is linearly independent and forms a basis for its span. Since the rows of an n × n orthogonal matrix form an orthonormal set, they must be linearly independent. Now we have n linearly independent vectors, and it follows that their span equals R^n. Therefore these vectors form an orthonormal basis for R^n.

Suppose now that we have an orthonormal basis for R^n. Since the basis will contain n vectors, these can be used to construct an n × n matrix, with each vector becoming a row. Therefore the matrix is composed of orthonormal rows, which by our above discussion means that the matrix is orthogonal. Note that we could also have constructed a matrix with each vector becoming a column instead, and this would again be an orthogonal matrix. In fact, this is simply the transpose of the previous matrix.

♠<br />
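A minimal numerical sketch of the construction in this proof (assuming NumPy; the basis vectors below are example choices, not from the text): stacking an orthonormal basis of R^2 as rows gives an orthogonal matrix, and stacking the same vectors as columns gives its transpose, which is again orthogonal.

```python
import numpy as np

# An orthonormal basis of R^2 (example vectors).
v1 = np.array([3.0, 4.0]) / 5.0
v2 = np.array([-4.0, 3.0]) / 5.0

U_rows = np.vstack([v1, v2])  # basis vectors as rows
U_cols = U_rows.T             # basis vectors as columns (the transpose)

print(np.allclose(U_rows @ U_rows.T, np.eye(2)))  # True: orthogonal
print(np.allclose(U_cols @ U_cols.T, np.eye(2)))  # True: also orthogonal
```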

Consider the following proposition.

Proposition 4.133: Determinant of Orthogonal Matrices

Suppose U is an orthogonal matrix. Then det(U) = ±1.

Proof. This result follows from the properties of determinants. Recall that for any matrix A, det(A^T) = det(A). Now if U is orthogonal, then:

(det(U))^2 = det(U^T) det(U) = det(U^T U) = det(I) = 1

Therefore (det(U))^2 = 1 and it follows that det(U) = ±1.

♠<br />

Orthogonal matrices are divided into two classes, proper and improper. The proper orthogonal matrices are those whose determinant equals 1, and the improper ones are those whose determinant equals −1. The reason for the distinction is that the improper orthogonal matrices are sometimes considered to have no physical significance: they cause a change in orientation, which would correspond to material passing through itself in a non-physical manner. Thus, when deciding which coordinate systems must be considered in certain applications, you only need those related by a proper orthogonal transformation. Geometrically, the linear transformations determined by the proper orthogonal matrices correspond to compositions of rotations.
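For a concrete illustration (again a sketch assuming NumPy, not from the text): a 2 × 2 rotation matrix is a proper orthogonal matrix with determinant 1, while a reflection across the x-axis is improper with determinant −1.

```python
import numpy as np

theta = 0.3
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
reflection = np.array([[1.0,  0.0],
                       [0.0, -1.0]])  # reflect across the x-axis

# Both are orthogonal, but only the rotation preserves orientation.
print(round(np.linalg.det(rotation), 6))    #  1.0 -> proper
print(round(np.linalg.det(reflection), 6))  # -1.0 -> improper
```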

We conclude this section with two useful properties of orthogonal matrices.

Example 4.134: Product and Inverse of Orthogonal Matrices

Suppose A and B are orthogonal matrices. Then AB and A^{−1} both exist and are orthogonal.
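As a numerical sanity check of this statement (a sketch assuming NumPy, with example matrices; it is not the text's argument), one can verify that the product of two orthogonal matrices and the inverse of an orthogonal matrix again satisfy U U^T = I.

```python
import numpy as np

def is_orthogonal(M):
    """Return True if the rows of M form an orthonormal set, i.e. M M^T = I."""
    return np.allclose(M @ M.T, np.eye(M.shape[0]))

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])                   # a permutation matrix (orthogonal)
B = np.array([[np.cos(1.0), -np.sin(1.0)],
              [np.sin(1.0),  np.cos(1.0)]])  # a rotation matrix (orthogonal)

print(is_orthogonal(A @ B))             # True: the product is orthogonal
print(is_orthogonal(np.linalg.inv(A)))  # True: the inverse is orthogonal
```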
