Linear Algebra, Theory And Applications, 2012a

MATRICES AND LINEAR TRANSFORMATIONS

Definition 2.4.8 A finite set of vectors $\{x_1, \cdots, x_r\}$ is a basis for a subspace $V$ of $\mathbb{F}^n$ if $\operatorname{span}(x_1, \cdots, x_r) = V$ and $\{x_1, \cdots, x_r\}$ is linearly independent.

Corollary 2.4.9 Let $\{x_1, \cdots, x_r\}$ and $\{y_1, \cdots, y_s\}$ be two bases for $V$. Then $r = s$.

Proof: From the exchange theorem, $r \leq s$ and $s \leq r$. $\blacksquare$

Definition 2.4.10 Let $V$ be a subspace of $\mathbb{F}^n$. Then $\dim(V)$, read as the dimension of $V$, is the number of vectors in a basis. This is well defined because, by Corollary 2.4.9, any two bases for $V$ contain the same number of vectors.
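The dimension can be computed concretely: list any spanning set of $V$ as the rows of a matrix and count the pivots after elimination. A minimal sketch in Python, using exact rational arithmetic so no floating-point tolerance is needed (the `rank` helper and the example vectors are illustrative, not from the text):

```python
from fractions import Fraction

def rank(rows):
    """Row rank via exact Gauss-Jordan elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        # find a pivot at or below position r in this column
        piv = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if piv is None:
            continue  # no pivot in this column
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Example subspace V of F^3 (here F = Q), spanned by three vectors,
# the third of which is the sum of the first two.
spanning = [[1, 0, 0], [0, 1, 0], [1, 1, 0]]
print(rank(spanning))  # dim V = 2
```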

Of course you should wonder right now whether an arbitrary subspace even has a basis. In fact it does, and this is proved in the next theorem. First, here is an interesting lemma.

Lemma 2.4.11 Suppose $v \notin \operatorname{span}(u_1, \cdots, u_k)$ and $\{u_1, \cdots, u_k\}$ is linearly independent. Then $\{u_1, \cdots, u_k, v\}$ is also linearly independent.

Proof: Suppose $\sum_{i=1}^{k} c_i u_i + dv = 0$. It is required to verify that each $c_i = 0$ and that $d = 0$. But if $d \neq 0$, then you can solve for $v$ as a linear combination of the vectors $\{u_1, \cdots, u_k\}$,
$$v = -\sum_{i=1}^{k} \left( \frac{c_i}{d} \right) u_i,$$
contrary to assumption. Therefore, $d = 0$. But then $\sum_{i=1}^{k} c_i u_i = 0$ and the linear independence of $\{u_1, \cdots, u_k\}$ implies each $c_i = 0$ also. $\blacksquare$
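Lemma 2.4.11 also gives a computable test: a finite list of vectors is linearly independent exactly when its rank equals its length, and adjoining a vector outside the span preserves that. A hedged sketch (helper names and example vectors are invented for illustration; exact `Fraction` arithmetic avoids rounding):

```python
from fractions import Fraction

def rank(rows):
    """Row rank via exact Gauss-Jordan elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def is_independent(vectors):
    """Linearly independent iff the rank equals the number of vectors."""
    return rank(vectors) == len(vectors)

u = [[1, 0, 0], [0, 1, 0]]                   # independent pair in F^3
assert is_independent(u)
assert is_independent(u + [[0, 0, 1]])       # v outside span(u): still independent (the lemma)
assert not is_independent(u + [[1, 1, 0]])   # v inside span(u): independence fails
```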

Theorem 2.4.12 Let $V$ be a nonzero subspace of $\mathbb{F}^n$. Then $V$ has a basis.

Proof: Let $v_1 \in V$ where $v_1 \neq 0$. If $\operatorname{span}\{v_1\} = V$, stop; $\{v_1\}$ is a basis for $V$. Otherwise, there exists $v_2 \in V$ which is not in $\operatorname{span}\{v_1\}$. By Lemma 2.4.11, $\{v_1, v_2\}$ is a linearly independent set of vectors. If $\operatorname{span}\{v_1, v_2\} = V$, stop; $\{v_1, v_2\}$ is a basis for $V$. If $\operatorname{span}\{v_1, v_2\} \neq V$, then there exists $v_3 \notin \operatorname{span}\{v_1, v_2\}$ and $\{v_1, v_2, v_3\}$ is a larger linearly independent set of vectors. Continuing this way, the process must stop before $n + 1$ steps because if not, it would be possible to obtain $n + 1$ linearly independent vectors, contrary to the exchange theorem. $\blacksquare$

In words, the following corollary states that any linearly independent set of vectors can be enlarged to form a basis.

Corollary 2.4.13 Let $V$ be a subspace of $\mathbb{F}^n$ and let $\{v_1, \cdots, v_r\}$ be a linearly independent set of vectors in $V$. Then either it is a basis for $V$ or there exist vectors $v_{r+1}, \cdots, v_s$ such that $\{v_1, \cdots, v_r, v_{r+1}, \cdots, v_s\}$ is a basis for $V$.

Proof: This follows immediately from the proof of Theorem 2.4.12. You do exactly the same argument except you start with $\{v_1, \cdots, v_r\}$ rather than $\{v_1\}$. $\blacksquare$
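The proof of Corollary 2.4.13 is itself an algorithm: repeatedly adjoin any vector of $V$ not yet in the span. For the special case $V = \mathbb{F}^n$, the standard basis vectors are an obvious pool of candidates; a sketch under that assumption follows (for a general subspace $V$ one would draw candidates from a spanning set of $V$ instead; names and example vectors are illustrative):

```python
from fractions import Fraction

def rank(rows):
    """Row rank via exact Gauss-Jordan elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def extend_to_basis(indep, n):
    """Extend a linearly independent list in F^n to a basis of F^n by
    adjoining standard basis vectors that enlarge the span (Lemma 2.4.11)."""
    basis = [list(v) for v in indep]
    for j in range(n):
        e = [1 if i == j else 0 for i in range(n)]  # standard basis vector e_j
        if rank(basis + [e]) > rank(basis):         # e_j not in span(basis)
            basis.append(e)
    return basis

b = extend_to_basis([[1, 1, 0]], 3)
print(b)  # [[1, 1, 0], [1, 0, 0], [0, 0, 1]]
```

Recomputing `rank(basis)` at every step is wasteful but keeps the sketch close to the proof; a serious implementation would maintain the eliminated matrix incrementally.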

It is also true that any spanning set of vectors can be restricted to obtain a basis.

Theorem 2.4.14 Let $V$ be a subspace of $\mathbb{F}^n$ and suppose $\operatorname{span}(u_1, \cdots, u_p) = V$ where the $u_i$ are nonzero vectors. Then there exist vectors $\{v_1, \cdots, v_r\}$ such that $\{v_1, \cdots, v_r\} \subseteq \{u_1, \cdots, u_p\}$ and $\{v_1, \cdots, v_r\}$ is a basis for $V$.

Proof: Let $r$ be the smallest positive integer with the property that for some set $\{v_1, \cdots, v_r\} \subseteq \{u_1, \cdots, u_p\}$,
$$\operatorname{span}(v_1, \cdots, v_r) = V.$$
Then $r \leq p$ and it must be the case that $\{v_1, \cdots, v_r\}$ is linearly independent, because if it were not so, one of the vectors, say $v_k$, would be a linear combination of the others. But then you could delete this vector from $\{v_1, \cdots, v_r\}$ and the resulting list of $r - 1$ vectors would still span $V$, contrary to the definition of $r$. $\blacksquare$
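The proof of Theorem 2.4.14 can also be read as a one-pass algorithm: walk through the spanning list and keep a vector only when it enlarges the span of the vectors kept so far. A minimal sketch under the same illustrative conventions as the statements above (exact rational arithmetic; the example vectors are made up):

```python
from fractions import Fraction

def rank(rows):
    """Row rank via exact Gauss-Jordan elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def basis_from_spanning(spanning):
    """Select a sublist of a spanning set that is a basis (Theorem 2.4.14):
    keep u only when it is not already in the span of the vectors kept."""
    basis = []
    for u in spanning:
        if rank(basis + [u]) > rank(basis):
            basis.append(u)
    return basis

spanning = [[1, 0, 0], [2, 0, 0], [0, 1, 0], [1, 1, 0], [0, 0, 1]]
print(basis_from_spanning(spanning))  # [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```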
