
NORMS FOR FINITE DIMENSIONAL VECTOR SPACES

Next show $u(t) = \cos(bt)$ and $v(t) = \sin(bt)$ work in the above and that there is at most one solution to
\[
w'' + b^2 w = 0, \qquad w(0) = \alpha, \quad w'(0) = \beta.
\]
Thus $z(t) = \cos(bt) + i\sin(bt)$ and so $y(t) = e^{at}\left(\cos(bt) + i\sin(bt)\right)$. To show there is at most one solution to the above problem, suppose you have two, $w_1, w_2$. Subtract them. Let $f = w_1 - w_2$. Thus
\[
f'' + b^2 f = 0
\]
and $f$ is real valued. Multiply both sides by $f'$ and conclude
\[
\frac{d}{dt}\left( \frac{(f')^2}{2} + \frac{b^2 f^2}{2} \right) = 0.
\]
Thus the expression in parentheses is constant. Explain why this constant must equal $0$.
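As a numerical sanity check of the energy argument above (not a substitute for the proof), one can verify that $(f')^2/2 + b^2 f^2/2$ is constant along an explicit solution of $f'' + b^2 f = 0$. The following Python sketch assumes NumPy; the values of $b$ and of the coefficients $c_1, c_2$ are arbitrary illustrative choices.

import numpy as np

# For f(t) = c1*cos(b t) + c2*sin(b t), a solution of f'' + b^2 f = 0,
# the "energy" (f')^2/2 + b^2 f^2/2 should take the same value at every t.
b, c1, c2 = 3.0, 0.7, -1.2                               # arbitrary illustrative values
t = np.linspace(0.0, 5.0, 200)
f = c1 * np.cos(b * t) + c2 * np.sin(b * t)              # the solution f
fp = b * (-c1 * np.sin(b * t) + c2 * np.cos(b * t))      # its derivative f'
energy = fp**2 / 2 + b**2 * f**2 / 2                     # the expression in parentheses
print(np.allclose(energy, energy[0]))                    # True: constant in t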

17. Let $A \in \mathcal{L}(\mathbb{R}^n, \mathbb{R}^n)$. Show the following power series converges in $\mathcal{L}(\mathbb{R}^n, \mathbb{R}^n)$:
\[
\sum_{k=0}^{\infty} \frac{t^k A^k}{k!}.
\]
You might want to use Lemma 14.4.2. This is how you can define $\exp(tA)$. Next show, using arguments like those of Corollary 14.4.3, that
\[
\frac{d}{dt}\exp(tA) = A\exp(tA),
\]
so that this is a matrix valued solution to the differential equation and initial condition
\[
\Psi'(t) = A\Psi(t), \qquad \Psi(0) = I.
\]
This $\Psi(t)$ is called a fundamental matrix for the differential equation $y' = Ay$. Show $t \mapsto \Psi(t)y_0$ gives a solution to the initial value problem
\[
y' = Ay, \qquad y(0) = y_0.
\]
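A minimal numerical sketch of Problem 17 (assuming NumPy and SciPy; the matrix $A$, the time $t$, and the truncation order are arbitrary illustrative choices): a partial sum of the series is compared with SciPy's matrix exponential, and $\Psi'(t) = A\Psi(t)$ is checked with a centered finite difference.

import numpy as np
from scipy.linalg import expm

def exp_series(A, t, terms=30):
    """Partial sum of sum_{k=0}^{terms-1} t^k A^k / k!."""
    n = A.shape[0]
    total = np.zeros((n, n))
    term = np.eye(n)                      # the k = 0 term
    for k in range(1, terms):
        total += term
        term = term @ (t * A) / k         # next term: multiply by tA, divide by k
    return total + term

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))           # arbitrary test matrix
t, h = 0.7, 1e-6

Psi = exp_series(A, t)
print(np.allclose(Psi, expm(t * A)))      # the truncated series matches expm(tA)

dPsi = (exp_series(A, t + h) - exp_series(A, t - h)) / (2 * h)
print(np.allclose(dPsi, A @ Psi, atol=1e-4))   # Psi'(t) ≈ A Psi(t)

Since only the $k = 0$ term survives at $t = 0$, the partial sum also satisfies $\Psi(0) = I$, so $t \mapsto \Psi(t)y_0$ starts at $y_0$, consistent with the initial value problem in the exercise.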

18. In Problem 17 $\Psi(t)$ is defined by the given series. Denote by $\exp(t\sigma(A))$ the numbers $\exp(t\lambda)$ where $\lambda \in \sigma(A)$. Show $\exp(t\sigma(A)) = \sigma(\Psi(t))$. This is like Lemma 14.4.7. Letting $J$ be the Jordan canonical form for $A$, explain why
\[
\Psi(t) \equiv \sum_{k=0}^{\infty} \frac{t^k A^k}{k!} = S\left( \sum_{k=0}^{\infty} \frac{t^k J^k}{k!} \right) S^{-1}
\]
and you note that in $J^k$, the diagonal entries are of the form $\lambda^k$ for $\lambda$ an eigenvalue of $A$. Also $J = D + N$ where $N$ is nilpotent and commutes with $D$. Argue then that
\[
\sum_{k=0}^{\infty} \frac{t^k J^k}{k!}
\]
is an upper triangular matrix which has on the diagonal the expressions $e^{\lambda t}$ where $\lambda \in \sigma(A)$. Thus conclude
\[
\sigma(\Psi(t)) \subseteq \exp(t\sigma(A)).
\]
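The spectral mapping statement can also be illustrated numerically. The sketch below (again assuming NumPy and SciPy, with an arbitrary matrix $A$ and time $t$) compares the eigenvalues of $\Psi(t) = \exp(tA)$ with the numbers $e^{t\lambda}$ for $\lambda \in \sigma(A)$.

import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))               # arbitrary test matrix
t = 0.3

eig_A = np.linalg.eigvals(A)                  # sigma(A)
eig_Psi = np.linalg.eigvals(expm(t * A))      # sigma(Psi(t))

# Compare the two spectra as sets by sorting; this assumes the eigenvalues
# are well separated, which holds for a generic random matrix.
lhs = np.sort_complex(np.exp(t * eig_A))      # exp(t * sigma(A))
rhs = np.sort_complex(eig_Psi)
print(np.allclose(lhs, rhs))                  # the two spectra agree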
