# An Elementary Proof of Stone's Theorem for Unitary Matrices

**Stone's Theorem.** Let $U(\cdot)$ be a measurable, matrix-valued function mapping $\mathbb{R}$ into the $n \times n$ unitary matrices such that, for all real numbers $s$ and $t$, $U(s+t) = U(s)U(t)$. Then there is a self-adjoint matrix $H$ such that $U(t) = e^{iHt}$. In particular, there is a unitary matrix $V$ and there are real numbers $\lambda_1, \dots, \lambda_n$ such that

$$U(t) = V E(t) V^*,$$

where $E(t)$ is the diagonal matrix with main-diagonal entries $e^{i\lambda_1 t}, \dots, e^{i\lambda_n t}$. The matrix $H$ has the form

$$H = V N V^*,$$

where $N$ is the diagonal matrix with main-diagonal entries $\lambda_1, \dots, \lambda_n$.

**Warm-Up Theorem.** Let $f : \mathbb{R} \to \mathbb{C}$ be a measurable function that is bounded on bounded intervals. Suppose that $f$ satisfies $f(0) = 1$ and, for all real $s$ and $t$, $f(s+t) = f(s)f(t)$. Then $f(t) \equiv e^{\alpha t}$ for some complex number $\alpha$.

*Proof of WUT.* For any $t \in \mathbb{R}$,

$$f(t) = \int_{-1/2}^{1/2} f(t-s)\, f(s)\, ds.$$

Therefore, for any $t$ and $t'$,

$$|f(t) - f(t')| \le M \int_{-1/2}^{1/2} |f(t-s) - f(t'-s)|\, ds,$$

where $M$ is the supremum of $|f|$ on $[-1/2, 1/2]$. This goes to zero as $t' \to t$, by the $L^1$-continuity of translation (applicable because $f$ is measurable and locally bounded). Therefore $f$ is continuous.

Let $\delta > 0$ be so small that $|t| < \delta$ implies $|1 - f(t)| < 1/10$, and fix some $t_0 \in (0, \delta)$. We can write

$$f(t_0) = \exp\bigl(\log|f(t_0)| + i\theta_0\bigr)$$

for some unique $\theta_0 \in (-\pi/4, \pi/4)$. Define

$$\beta \equiv \log|f(t_0)| + i\theta_0.$$

Now, what is $f(t_0/2)$? It must satisfy $(f(t_0/2))^2 = f(t_0)$, and the two numbers that do this are $\exp(\beta/2)$ and $-\exp(\beta/2)$. But $f(t_0/2)$ cannot be further than a distance of $1/10$ from $1$, so the second root is out. Thus $f(t_0/2) = \exp(\beta/2)$. Continuing this way, we see that $f(t_0/2^k) = \exp(\beta/2^k)$ for all non-negative integers $k$. The law $f(s+t) = f(s)f(t)$ trivially implies that $f(m t_0) = \exp(m\beta)$ for all integers $m$. It's now an easy matter to show that $f(r t_0) = \exp(r\beta)$ for any dyadic rational $r$, i.e., any number of the form $r = n/2^j$. Since the dyadic rationals are dense in $\mathbb{R}$, and $f$ and the exponential function are continuous, we get that $f(r t_0) = \exp(r\beta)$ for all real numbers $r$.
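The dyadic-root step can be sanity-checked numerically. In the sketch below, the value of `alpha` and the sample point `t0` are illustrative choices of mine, not taken from the text; `cmath.log` returns the principal logarithm, whose imaginary part plays the role of $\theta_0$:

```python
import cmath

# Illustrative alpha (my choice): f(t) = exp(alpha * t) is measurable,
# locally bounded, f(0) = 1, and satisfies f(s + t) = f(s) f(t).
alpha = 0.3 + 1.2j

def f(t):
    return cmath.exp(alpha * t)

# The multiplicative law:
s, t = 0.7, -1.3
assert abs(f(s + t) - f(s) * f(t)) < 1e-12

# One sample f(t0) with |1 - f(t0)| < 1/10; beta = log|f(t0)| + i*theta0
# is its principal logarithm, with theta0 in (-pi/4, pi/4).
t0 = 0.01
assert abs(f(t0) - 1) < 1 / 10
beta = cmath.log(f(t0))

# The dyadic-root step: of the two square roots of f(t0), only exp(beta/2)
# lies within 1/10 of 1, and indeed f(t0/2) = exp(beta/2).
assert abs(f(t0 / 2) - cmath.exp(beta / 2)) < 1e-12

# Finally, alpha = beta / t0, as at the end of the proof.
assert abs(beta / t0 - alpha) < 1e-12
```

The check with `-exp(beta/2)` would fail, since that root sits near $-1$, far from $1$; this is exactly how the proof rules out the second square root.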
If $t \in \mathbb{R}$ is arbitrary, we can write $f(t) = f(r t_0) = \exp(r\beta)$ for $r = t/t_0$, implying $f(t) = \exp(\alpha t)$ for $\alpha = \beta/t_0$. The WUT is proved. $\square$

**Lemma.** Let $U$ be an $n \times n$ normal matrix with eigenvalues (counted according to multiplicity) $\lambda_1, \dots, \lambda_n$. If $I$ is the $n \times n$ identity matrix, then the operator norm of $U - I$ equals $\max_i |1 - \lambda_i|$.

*Proof of Lemma.* The matrix $U$ has an orthonormal basis of eigenvectors $u_1, \dots, u_n$, where we assume they are ordered in the same way as the $\lambda_i$'s. If $x \in \mathbb{C}^n$ then

$$U x = \sum_{i=1}^n \lambda_i \langle x, u_i \rangle u_i$$

and

$$I x = x = \sum_{i=1}^n \langle x, u_i \rangle u_i.$$

Therefore

$$\|(U - I)x\|^2 = \sum_{i=1}^n |\lambda_i - 1|^2 |\langle x, u_i \rangle|^2 \le \Bigl(\max_i |1 - \lambda_i|\Bigr)^2 \sum_{i=1}^n |\langle x, u_i \rangle|^2 = \Bigl(\max_i |1 - \lambda_i|\Bigr)^2 \|x\|^2,$$

implying $\|U - I\| \le \max_i |1 - \lambda_i|$. We get equality by applying $U - I$ to the appropriate eigenvector $u_i$. $\square$

**Lemma.** Suppose that $U$ and $V$ are two normal $n \times n$ matrices such that $U^2 = V$ and all of $U$'s eigenvalues have positive real parts. Then $U$ and $V$ have the same eigenvectors: any $v \in \mathbb{C}^n$ is an eigenvector for $U$ if and only if it's an eigenvector for $V$.

**Remark.** Some restriction on $U$'s eigenvalues is required. Consider

$$U \equiv \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}, \qquad V \equiv \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}.$$

Both are normal (they're even unitary), and $U^2 = V$, but any non-zero vector is an eigenvector for $V$, which obviously isn't true for $U$.

*Proof of Lemma.* Any eigenvector for $U$ is trivially one for $V$. We show the other direction. Let $U$ have an orthonormal basis of eigenvectors $u_1, \dots, u_n$, with corresponding eigenvalues $\lambda_1, \dots, \lambda_n$. Let $x$ be an eigenvector for $V$ with eigenvalue $\gamma$. We can write

$$x = \sum_{i=1}^n \langle x, u_i \rangle u_i,$$

which implies

$$U^2 x = \sum_{i=1}^n \lambda_i^2 \langle x, u_i \rangle u_i = V x = \gamma x = \sum_{i=1}^n \gamma \langle x, u_i \rangle u_i.$$

This can only happen if $\langle x, u_i \rangle = 0$ for those $i$'s such that $\lambda_i^2 \ne \gamma$. The function $z \mapsto z^2$ is one-to-one on the set of $z$'s with positive real parts (where all the $\lambda_i$'s lie). Therefore those $\lambda_i$'s for which $\langle x, u_i \rangle \ne 0$ (forcing $\lambda_i^2 = \gamma$) are all equal to the same number, call it $\lambda$. So we get

$$x = \sum_{i \,:\, \lambda_i = \lambda} \langle x, u_i \rangle u_i$$

and

$$U x = \sum_{i \,:\, \lambda_i = \lambda} \lambda \langle x, u_i \rangle u_i = \lambda x,$$

and $x$ is an eigenvector for $U$. $\square$

*Proof of Stone's Theorem.* The proof that $U(t)$ is continuous is just like the proof of $f$'s continuity in the WUT. The product law $U(s+t) = U(s)U(t)$ implies that $U(0) = I$. Let $\delta > 0$ be such that $|t| < \delta$ implies $\|U(t) - I\| < 1/10$, and fix a $t_0 \in (0, \delta)$. We can write

$$U(t_0) = V L_0 V^*,$$

where $V$ is unitary and $L_0$ is a diagonal matrix with entries $e^{i\theta_1}, \dots, e^{i\theta_n}$, and each $\theta_j$ lies in $(-\pi/4, \pi/4)$ (because $|1 - e^{i\theta_j}| < 1/10$, by the first Lemma). Important: all of $U(t_0)$'s eigenvalues have positive real parts! Now, $U(t_0) = U(t_0/2)^2$, and $\|U(t_0/2) - I\| < 1/10$ as well, which means that $U(t_0/2)$'s eigenvalues all have positive real parts too. Therefore $V$'s columns, which are $U(t_0)$'s orthonormal eigenvectors, are also eigenvectors for $U(t_0/2)$ by the second Lemma, and we can write

$$U(t_0/2) = V L_1 V^*$$

for some diagonal matrix $L_1$. This $L_1$ must satisfy $(L_1)^2 = L_0$, and its diagonal entries $\gamma_j$ are $U(t_0/2)$'s eigenvalues: $\gamma_j^2 = e^{i\theta_j}$. The $\gamma_j$'s have to be no further than $1/10$ from $1$, and therefore they have to satisfy $\gamma_j = e^{i\theta_j/2}$. Continuing as in the WUT, $U(t_0/2^k)$ will equal $V L_k V^*$, where $L_k$ is a diagonal matrix with main-diagonal entries $e^{i\theta_j/2^k}$. Following the pattern of the WUT, we see that $U(t)$ will have the form $V L(t) V^*$, where $L(t)$ is diagonal with main-diagonal entries $e^{i(\theta_j/t_0)t}$. Setting $\lambda_j \equiv \theta_j/t_0$, $N \equiv \mathrm{diag}(\lambda_1, \dots, \lambda_n)$, and $H \equiv V N V^*$ gives $U(t) = e^{iHt}$. The theorem is proved. $\square$
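As a sanity check on the construction (not part of the proof), the sketch below builds a hypothetical self-adjoint generator $H$, defines $U(t) = e^{iHt}$ through the spectral decomposition, and then recovers $H$ from a single small sample $U(t_0)$ exactly as in the proof: diagonalize, read off the principal angles $\theta_j$, and divide by $t_0$. The generator, the random seed, the sample time `t0`, and the tolerances are all my own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3

# A hypothetical self-adjoint generator H (illustrative, not from the text).
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
H = (A + A.conj().T) / 2
lam, V = np.linalg.eigh(H)                      # H = V diag(lam) V*

def U(t):
    # U(t) = e^{iHt}, computed through the spectral decomposition.
    return V @ np.diag(np.exp(1j * lam * t)) @ V.conj().T

# The group law and unitarity:
s, t = 0.4, 1.1
assert np.allclose(U(s + t), U(s) @ U(t))
assert np.allclose(U(t) @ U(t).conj().T, np.eye(n))

# The first lemma, for the normal matrix U(t0):
# ||U(t0) - I|| equals max_j |1 - e^{i theta_j}|.
t0 = 0.05          # small enough here that every theta_j lies in (-pi/4, pi/4)
assert np.isclose(np.linalg.norm(U(t0) - np.eye(n), 2),
                  np.max(np.abs(1 - np.exp(1j * lam * t0))))

# Recovering H as in the proof: diagonalize U(t0), take the principal
# argument of each eigenvalue, and divide by t0.
mu, W = np.linalg.eig(U(t0))
H_rec = W @ np.diag(np.angle(mu) / t0) @ np.linalg.inv(W)
assert np.allclose(H_rec, H, atol=1e-8)
```

If `t0` were taken too large, some $\theta_j$ could leave $(-\pi/4, \pi/4)$ and `np.angle` would land on the wrong branch; this is precisely why the proof fixes $t_0$ inside $(0, \delta)$.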
