Linear Algebra

Chapter Two. Vector Spaces

1.2 Example  In $\mathbb{R}^3$, where

$$\vec{v}_1 = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} \qquad \vec{v}_2 = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} \qquad \vec{v}_3 = \begin{pmatrix} 2 \\ 1 \\ 0 \end{pmatrix}$$

the spans $[\{\vec{v}_1, \vec{v}_2\}]$ and $[\{\vec{v}_1, \vec{v}_2, \vec{v}_3\}]$ are equal since $\vec{v}_3$ is in the span $[\{\vec{v}_1, \vec{v}_2\}]$.

The lemma says that if we have a spanning set then we can remove a $\vec{v}$ to get a new set $S$ with the same span if and only if $\vec{v}$ is a linear combination of vectors from $S$. Thus, under the second sense described above, a spanning set is minimal if and only if it contains no vectors that are linear combinations of the others in that set. We have a term for this important property.

1.3 Definition  A subset of a vector space is linearly independent if none of its elements is a linear combination of the others. Otherwise it is linearly dependent.

Here is an important observation: although this way of writing one vector as a combination of the others

$$\vec{s}_0 = c_1\vec{s}_1 + c_2\vec{s}_2 + \cdots + c_n\vec{s}_n$$

visually sets $\vec{s}_0$ off from the other vectors, algebraically there is nothing special in that equation about $\vec{s}_0$. For any $\vec{s}_i$ with a coefficient $c_i$ that is nonzero, we can rewrite the relationship to set off $\vec{s}_i$.

$$\vec{s}_i = (1/c_i)\vec{s}_0 + (-c_1/c_i)\vec{s}_1 + \cdots + (-c_n/c_i)\vec{s}_n$$

When we don't want to single out any vector by writing it alone on one side of the equation we will instead say that $\vec{s}_0, \vec{s}_1, \ldots, \vec{s}_n$ are in a linear relationship and write the relationship with all of the vectors on the same side. The next result rephrases the linear independence definition in this style. It gives what is usually the easiest way to compute whether a finite set is dependent or independent.

1.4 Lemma  A subset $S$ of a vector space is linearly independent if and only if for any distinct $\vec{s}_1, \ldots, \vec{s}_n \in S$ the only linear relationship among those vectors

$$c_1\vec{s}_1 + \cdots + c_n\vec{s}_n = \vec{0} \qquad c_1, \ldots, c_n \in \mathbb{R}$$

is the trivial one: $c_1 = 0, \ldots, c_n = 0$.

Proof. This is a direct consequence of the observation above.

If the set $S$ is linearly independent then no vector $\vec{s}_i$ can be written as a linear combination of the other vectors from $S$, so there is no linear relationship where some of the $\vec{s}$'s have nonzero coefficients. If $S$ is not linearly independent then some $\vec{s}_i$ is a linear combination $\vec{s}_i = c_1\vec{s}_1 + \cdots + c_{i-1}\vec{s}_{i-1} + c_{i+1}\vec{s}_{i+1} + \cdots + c_n\vec{s}_n$ of other vectors from $S$, and subtracting $\vec{s}_i$ from both sides of that equation gives a linear relationship involving a nonzero coefficient, namely the $-1$ in front of $\vec{s}_i$. QED
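Lemma 1.4 turns the question of independence into a question about a homogeneous linear system, so it can be checked mechanically. The following is a minimal sketch, not part of the text, using NumPy and a hypothetical helper `is_linearly_independent`: it stacks the candidate vectors as the columns of a matrix and tests whether the rank equals the number of vectors, which holds exactly when the only relationship $c_1\vec{s}_1 + \cdots + c_n\vec{s}_n = \vec{0}$ is the trivial one. It is applied here to the vectors of Example 1.2.

```python
import numpy as np

def is_linearly_independent(vectors):
    """Return True when the given vectors are linearly independent (the Lemma 1.4 test)."""
    # The relationship c_1*s_1 + ... + c_n*s_n = 0 is the homogeneous system A c = 0,
    # where the columns of A are the vectors; it has only the trivial solution
    # exactly when rank(A) equals the number of vectors n.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

# The vectors of Example 1.2.
v1 = np.array([1, 0, 0])
v2 = np.array([0, 1, 0])
v3 = np.array([2, 1, 0])

print(is_linearly_independent([v1, v2]))      # True: only the trivial relationship exists
print(is_linearly_independent([v1, v2, v3]))  # False: 2*v1 + 1*v2 - 1*v3 = 0 is nontrivial
```

The second call also illustrates Example 1.2 itself: because $\vec{v}_3 = 2\vec{v}_1 + \vec{v}_2$ lies in the span $[\{\vec{v}_1, \vec{v}_2\}]$, adding it to the set does not enlarge the span, and the set of three vectors is dependent.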
