A First Course in Linear Algebra, 2017a

Matrices

Proof. Suppose that A and B are matrices such that both products AB and BA are identity matrices. We will show that A and B must be square matrices of the same size. Let the matrix A have m rows and n columns, so that A is an m × n matrix. Since the product AB exists, B must have n rows, and since the product BA exists, B must have m columns, so that B is an n × m matrix. To finish the proof, we need only verify that m = n.

We first apply Lemma 2.60 to A and B to obtain the inequality m ≤ n. We then apply Lemma 2.60 again (switching the order of the matrices) to obtain the inequality n ≤ m. It follows that m = n, as we wanted.

♠

Of course, not all square matrices are invertible. In particular, zero matrices are not invertible, along with many other square matrices.

The following proposition will be useful in proving the next theorem.

Proposition 2.62: Reduced Row-Echelon Form of a Square Matrix

If R is the reduced row-echelon form of a square matrix, then either R has a row of zeros or R is an identity matrix.
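As an informal check of this dichotomy (my own sketch, not part of the text), SymPy's `rref` method computes the reduced row-echelon form directly; a singular square matrix yields a zero row, while an invertible one yields the identity:

```python
from sympy import Matrix, eye

# A singular square matrix: its RREF must contain a row of zeros.
A = Matrix([[1, 2], [2, 4]])
R, pivots = A.rref()
assert any(all(entry == 0 for entry in R.row(i)) for i in range(R.rows))

# An invertible square matrix: its RREF must be the identity.
B = Matrix([[1, 2], [3, 4]])
R2, _ = B.rref()
assert R2 == eye(2)
```

Here the second row of R is (0, 0) because the rows of A are linearly dependent, exactly the "row of zeros" case of the proposition.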

The proof of this proposition is left as an exercise to the reader. We now consider the second important theorem of this section.

Theorem 2.63: Unique Inverse of a Matrix

Suppose A and B are square matrices such that AB = I where I is an identity matrix. Then it follows that BA = I. Further, both A and B are invertible, and B = A⁻¹ and A = B⁻¹.

Proof. Let R be the reduced row-echelon form of the square matrix A. Then R = EA, where E is an invertible matrix. Since AB = I, Lemma 2.59 gives us that R does not have a row of zeros. By noting that R is a square matrix and applying Proposition 2.62, we see that R = I. Hence, EA = I.

Using both that EA = I and AB = I, we can finish the proof with the chain of equalities

BA = IBIA = (EA)B(E⁻¹E)A
   = E(AB)E⁻¹(EA)
   = EIE⁻¹I
   = EE⁻¹ = I

It follows from the definition of the inverse of a matrix that B = A⁻¹ and A = B⁻¹.

♠
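As a quick numerical illustration of the theorem (a sketch of my own, not from the text), we can compute a right inverse B of a square matrix A by solving AX = I, and then observe that B automatically works on the left as well:

```python
import numpy as np

# A square, invertible matrix.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# Solve A X = I for X, so that B satisfies AB = I by construction.
B = np.linalg.solve(A, np.eye(2))

# Theorem 2.63 then guarantees the other product is also the identity.
assert np.allclose(A @ B, np.eye(2))
assert np.allclose(B @ A, np.eye(2))
```

Only AB = I was imposed when computing B; the check BA = I succeeds for free, which is exactly the content of the theorem.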

This theorem is very useful, since with it we need only test one of the products AB or BA in order to check that B is the inverse of A. The hypothesis that A and B are square matrices is very important, and without it the theorem does not hold.
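To see why squareness matters, here is a small counterexample of my own (separate from the example the text gives next): a non-square A can satisfy AB = I while BA fails to be an identity matrix.

```python
import numpy as np

# A is 1 × 2 and B is 2 × 1, so AB is 1 × 1 while BA is 2 × 2.
A = np.array([[1.0, 0.0]])
B = np.array([[1.0],
              [0.0]])

# AB is the 1 × 1 identity, but BA = [[1, 0], [0, 0]] is not the 2 × 2 identity.
assert np.allclose(A @ B, np.eye(1))
assert not np.allclose(B @ A, np.eye(2))
```

Here BA is a 2 × 2 matrix of rank 1, so it can never equal I; this is the failure mode that the square hypothesis rules out.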

We will now consider an example.
