J. Korean Math. Soc. 35 (1998), No. 3, pp. 713-725

A SHARP BOUND FOR ITÔ PROCESSES

Changsun Choi

Abstract. Let $X$ and $Y$ be Itô processes with $dX_s = \varphi_s\,dB_s + \psi_s\,ds$ and $dY_s = \zeta_s\,dB_s + \xi_s\,ds$. Burkholder obtained a sharp bound on the distribution of the maximal function of $Y$ under the assumption that $|Y_0| \le |X_0|$, $|\zeta| \le |\varphi|$, $|\xi| \le |\psi|$, and that $X$ is a nonnegative local submartingale. In this paper we consider a wider class of Itô processes, replace the assumption $|\xi| \le |\psi|$ by a more general one $|\xi| \le \beta|\psi|$, where $\beta \ge 0$ is a constant, and get a weak-type inequality between $X$ and the maximal function of $Y$. This inequality, being sharp for all $\beta \ge 0$, extends the work by Burkholder.

1. Introduction

Let $(\Omega, \mathcal{F}, P)$ be a complete probability space with a right-continuous filtration $(\mathcal{F}_t)_{t \ge 0}$ such that $A \in \mathcal{F}_0$ whenever $A \in \mathcal{F}$ and $P(A) = 0$. The adapted real Brownian motion $B = (B_t)_{t \ge 0}$ starts at $0$ and the process $(B_t - B_s)_{t \ge s}$ is independent of $\mathcal{F}_s$ for all $s \ge 0$.

Let $\varphi$ and $\psi$ be real predictable processes such that

(1.1) $P\Big( \int_0^t \big( |\varphi_s|^2 + |\psi_s| \big)\,ds < \infty \text{ for all } t > 0 \Big) = 1.$

Also, let $\zeta$ and $\xi$ be $\mathbb{R}^\nu$-valued predictable processes, where $\nu$ is a positive integer. We assume the condition (1.1) for $\zeta$ and $\xi$. The Itô processes $X$ and $Y$ are defined by

(1.2) $X_t = X_0 + \int_0^t \varphi_s\,dB_s + \int_0^t \psi_s\,ds, \qquad Y_t = Y_0 + \int_0^t \zeta_s\,dB_s + \int_0^t \xi_s\,ds.$

Received February 24, 1998.
1991 Mathematics Subject Classification: Primary 60H05; Secondary 60E15.
Key words and phrases: Itô process, $\beta$-subordinate, Itô's formula, Doob's optional sampling theorem, stopping time, martingale, submartingale, Brownian motion, exit time, strong Markov property, best constant.
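
To fix ideas, here is a minimal simulation sketch of (1.2). It is not part of the paper: the Euler-Maruyama discretization, the horizon $T = 1$, the level $\lambda = 3$, and the choice $\varphi = X$, $\psi = \xi = 0$, $\zeta = \pm\varphi$ are all illustrative assumptions. With these choices $X$ is a nonnegative martingale with $X_0 = 1$, so $\|X\| = 1$ (see the remark after (2.2) below), $Y$ satisfies the subordination hypotheses below with $\beta = 0$, and the simulated $\lambda P(Y^* \ge \lambda)$ can be compared with the sharp bound $(\beta+2)\|X\| = 2$ proved in Section 2.

import numpy as np

# Illustrative Euler-Maruyama simulation of (1.2): dX = phi dB + psi ds and
# dY = zeta dB + xi ds, with phi = X, psi = 0, zeta = (+/-1)*phi, xi = 0.
rng = np.random.default_rng(0)
n_paths, n_steps, T, lam = 20000, 2000, 1.0, 3.0
dt = T / n_steps

X = np.ones(n_paths)                 # X_0 = 1
Y = np.ones(n_paths)                 # Y_0 = 1, so |Y_0| <= |X_0|
Y_max = np.abs(Y)
for k in range(n_steps):
    dB = rng.normal(0.0, np.sqrt(dt), n_paths)
    sign = 1.0 if np.sin(k * dt) >= 0.0 else -1.0   # any predictable sign works
    dX = X * dB                      # phi = X, psi = 0
    X = X + dX
    Y = Y + sign * dX                # |zeta| = |phi| and xi = 0, so beta = 0
    Y_max = np.maximum(Y_max, np.abs(Y))

# The maximum over [0, T] underestimates Y*, so this check is conservative.
print(f"lambda*P(Y* >= lambda) ~ {lam * np.mean(Y_max >= lam):.3f} (bound: 2)")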


We assume that $X_0$ is constant and that $X$ and $Y$ are continuous. We set $Y^* = \sup_{t \ge 0} |Y_t|$ and $\|X\| = \sup E|X_\tau|$, where the supremum is taken over all bounded stopping times $\tau$.

The following inequality is due to Burkholder (1993). Also see Burkholder (1994) for related inequalities.

Theorem 1.1. If $X \ge 0$, $\psi \ge 0$, $|Y_0| \le |X_0|$, $|\zeta| \le |\varphi|$ and $|\xi| \le |\psi|$, then

$\lambda P(Y^* \ge \lambda) \le 3\|X\|$ for all $\lambda > 0$,

and the constant $3$ is best possible.

2. A sharp probability bound

Let $(\Omega, \mathcal{F}, P)$ and $(\mathcal{F}_t)_{t \ge 0}$ be as in the introduction. The adapted square integrable real martingale $M$ starts at $0$ and, for all $s \ge 0$, the process $(M_t - M_s)_{t \ge s}$ is independent of $\mathcal{F}_s$. Let $\langle M \rangle$ be the quadratic variational process of $M$. The adapted integrable increasing process $A$ starts at $0$. We assume that $M$ and $A$ are continuous. Thus $\langle M \rangle$ is also continuous.

We follow Ikeda and Watanabe (1981) for notions of stochastic processes. Thus, increasing in the above means non-decreasing. Terms like positive, negative and decreasing will be used similarly. Also, one may see the same book for the basic facts of stochastic processes and stochastic integrals.

Consider real predictable processes $\varphi$ and $\psi$ such that for all $t > 0$

(2.1) $\int_0^t |\varphi_s|^2\,d\langle M \rangle_s < \infty$ and $\int_0^t |\psi_s|\,dA_s < \infty.$

Let $H$ be a Hilbert space over $\mathbb{R}$. For $x, y \in H$ we denote by $x \cdot y$ the inner product of $x$ and $y$ and put $|x|^2 = x \cdot x$. The $H$-valued predictable processes $\zeta$ and $\xi$ satisfy the condition (2.1). The Itô processes $X$ and $Y$ are defined by

(2.2) $X_t = X_0 + \int_0^t \varphi_s\,dM_s + \int_0^t \psi_s\,dA_s, \qquad Y_t = Y_0 + \int_0^t \zeta_s\,dM_s + \int_0^t \xi_s\,dA_s.$
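
A standard fact worth recording here (not stated in the original): if $X$ is an integrable nonnegative submartingale, then Doob's optional sampling theorem gives $E X_\tau \le E X_N$ for every bounded stopping time $\tau \le N$, and hence

$\|X\| = \sup_\tau E|X_\tau| = \sup_{t \ge 0} E X_t.$

This is the mechanism used in Section 4 to evaluate $\|X\|$ for the extremal example.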


We assume that $X_0$ is constant and that $X$ and $Y$ are continuous. Let $Y^*$ and $\|X\|$ be as in the introduction.

Definition 2.1. For $\beta \ge 0$ we define that $Y$ is $\beta$-subordinate to $X$ if

(2.3) $|Y_0| \le |X_0|,$
(2.4) $|\zeta| \le |\varphi|,$
(2.5) $|\xi| \le \beta|\psi|.$

Theorem 2.2. If $X \ge 0$, $\psi \ge 0$, and $Y$ is $\beta$-subordinate to $X$, then

(2.6) $\lambda P(Y^* \ge \lambda) \le (\beta+2)\|X\|$ for all $\lambda > 0$,

and the constant $\beta+2$ is best possible.

Proof of the inequality. In order to make the key points of the proof clear we defer some technical details to Section 3 and use some unproved claims and lemmas in this proof.

We may assume that $\lambda = 1$ and $(\beta+2)\|X\| < 1$.

Claim 2.3. We may assume that $X > 0$ and $|Y| > 0$.

Claim 2.4. It suffices to prove

(2.7) $P(|Y_\tau| \ge 1) \le (\beta+2)\|X\|$

whenever $\tau$ is a bounded stopping time.

Let $\tau$ be a bounded stopping time. As a matter of fact, we will prove the stronger inequality

(2.8) $P(X_\tau + |Y_\tau| \ge 1) \le (\beta+2)\|X\|.$

We may assume that the process $X + |Y|$ can be stopped on the surface $x + |y| = 1$.


Claim 2.5. It suffices to prove

(2.9) $P(X_\mu + |Y_\mu| = 1) \le (\beta+2)\,E X_\mu$

whenever $\mu$ is a bounded stopping time such that

(2.10) $E \int_0^\mu |\varphi_s|^2\,d\langle M \rangle_s < \infty,$

(2.11) $X_t + |Y_t| \le 1$ if $0 \le t \le \mu$.

Put $S = \{(x, y) : x > 0 \text{ and } y \in H \text{ with } |y| > 0\}$ and define functions $U$ and $V$ on $S$ by

(2.12) $U(x, y) = \big( |y| - (\beta+1)x \big)\big( x + |y| \big)^{1/(\beta+1)}$

and

(2.13) $V(x, y) = -(\beta+2)x$ if $x + |y| < 1$; $\quad V(x, y) = 1 - (\beta+2)x$ if $x + |y| \ge 1$.

Let $\mu$ be a bounded stopping time satisfying (2.10) and (2.11). By Claim 2.3 we have $(X_\mu, Y_\mu) \in S$. And, from (2.11) and (2.13) we see that the inequality (2.9) is equivalent to the inequality

(2.14) $E\,V(X_\mu, Y_\mu) \le 0.$

Lemma 2.6. (a) If $x + |y| \le 1$, then $V(x, y) \le U(x, y)$.
(b) If $x \ge |y|$, then $U(x, y) \le 0$.

From (2.11) and (a) of Lemma 2.6 we have $E\,V(X_\mu, Y_\mu) \le E\,U(X_\mu, Y_\mu)$. Also, $|Y_0| \le |X_0|$ from (2.3) and $X$ is positive, thus (b) of Lemma 2.6 implies that $E\,U(X_0, Y_0) \le 0$. Hence, the inequality (2.14) follows from the inequality

(2.15) $E\,U(X_\mu, Y_\mu) \le E\,U(X_0, Y_0).$
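
A quick verification, not spelled out in the paper, of how (2.12) and (2.13) fit together on the stopping surface: if $x + |y| = 1$, then $(x + |y|)^{1/(\beta+1)} = 1$, so

$U(x, y) = |y| - (\beta+1)x = (1 - x) - (\beta+1)x = 1 - (\beta+2)x = V(x, y).$

Thus (a) of Lemma 2.6 holds with equality on the surface $x + |y| = 1$.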


Since $\mu$ is bounded and $U$ is smooth, we may use Itô's formula to get

(2.16) $U(X_\mu, Y_\mu) = U(X_0, Y_0) + \int_0^\mu \big( U_x(X_s, Y_s)\varphi_s + U_y(X_s, Y_s) \cdot \zeta_s \big)\,dM_s + \int_0^\mu \big( U_x(X_s, Y_s)\psi_s + U_y(X_s, Y_s) \cdot \xi_s \big)\,dA_s + \frac{1}{2}\int_0^\mu \big( U_{xx}(X_s, Y_s)|\varphi_s|^2 + 2U_{xy}(X_s, Y_s) \cdot \varphi_s\zeta_s + U_{yy}(X_s, Y_s)\zeta_s \cdot \zeta_s \big)\,d\langle M \rangle_s.$

Here $U_{yy}(X_s, Y_s)$ can be regarded as a linear transformation from $H$ to $H$. For differentiation of vector functions one may see Lang (1968). The inequality (2.15) follows if we show that the above three integrals in (2.16) have negative expectations.

Lemma 2.7. (a) $U_x(x, y) + \beta|U_y(x, y)| \le 0$ for all $(x, y) \in S$.
(b) $|U_x(x, y)| + |U_y(x, y)| \le \beta+2$ if $(x, y) \in S$ and $x + |y| \le 1$.
(c) $U_{xx}(x, y)|h|^2 + 2U_{xy}(x, y) \cdot hk + U_{yy}(x, y)k \cdot k \le 0$ whenever $(x, y) \in S$, $h \in \mathbb{R}$, $k \in H$ and $|h| \ge |k|$.

The first integral in (2.16) has zero expectation because $\mu$ is bounded and the process

$\Big( \int_0^{\mu \wedge t} \big( U_x(X_s, Y_s)\varphi_s + U_y(X_s, Y_s) \cdot \zeta_s \big)\,dM_s \Big)_{t \ge 0}$

is a martingale starting at $0$; for this observe that $|\zeta_s| \le |\varphi_s|$ from (2.4) and that (2.10), (2.11), (b) of Lemma 2.7 and the Cauchy-Schwarz inequality imply

$E \int_0^\mu \big| U_x(X_s, Y_s)\varphi_s + U_y(X_s, Y_s) \cdot \zeta_s \big|^2\,d\langle M \rangle_s \le E \int_0^\mu \big( |U_x(X_s, Y_s)| + |U_y(X_s, Y_s)| \big)^2 |\varphi_s|^2\,d\langle M \rangle_s \le (\beta+2)^2\,E \int_0^\mu |\varphi_s|^2\,d\langle M \rangle_s < \infty.$
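
Part (c) of Lemma 2.7 is the concavity property that drives the whole argument, and it can be spot-checked numerically before its proof in Section 3. The sketch below is illustrative and not part of the paper: it takes $H = \mathbb{R}^2$, fixes an arbitrary $\beta$, samples $(x, y) \in S$, $h \in \mathbb{R}$ and $k \in H$ with $|h| > |k|$, and approximates $G''(0)$ for $G(t) = U(x + th, y + tk)$ by a central second difference; by the chain rule, $G''(0)$ equals the left-hand side of (c).

import numpy as np

# Numerical spot-check of (c) of Lemma 2.7 with H = R^2 (illustrative).
rng = np.random.default_rng(1)
beta, eps = 1.5, 1e-4

def U(x, y):
    r = np.linalg.norm(y)
    return (r - (beta + 1.0) * x) * (x + r) ** (1.0 / (beta + 1.0))

worst = -np.inf
for _ in range(10000):
    x = rng.uniform(0.1, 1.0)
    y = rng.normal(size=2)
    y *= rng.uniform(0.1, 1.0) / np.linalg.norm(y)      # 0.1 <= |y| <= 1
    k = rng.normal(size=2)
    h = np.sign(rng.normal()) * np.linalg.norm(k) * rng.uniform(1.05, 2.0)
    def G(t):
        return U(x + t * h, y + t * k)
    g2 = (G(eps) - 2.0 * G(0.0) + G(-eps)) / eps ** 2   # central 2nd difference
    worst = max(worst, g2)
print(f"max G''(0) over samples: {worst:.6f} (Lemma 2.7(c) predicts <= 0)")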


The remaining two integrals in (2.16) have negative integrands; thus they have negative expectations because the processes $A$ and $\langle M \rangle$ are increasing. Since $\psi \ge 0$ and $|\xi| \le \beta|\psi|$ from (2.5), we use the Cauchy-Schwarz inequality and (a) of Lemma 2.7 to get

$U_x(X_s, Y_s)\psi_s + U_y(X_s, Y_s) \cdot \xi_s \le U_x(X_s, Y_s)\psi_s + |U_y(X_s, Y_s)|\,|\xi_s| \le \big( U_x(X_s, Y_s) + \beta|U_y(X_s, Y_s)| \big)\psi_s \le 0.$

Similarly, the integrand of the third integral is negative because $(X_s, Y_s) \in S$ from Claim 2.3 and $|\zeta_s| \le |\varphi_s|$ from (2.4): put $x = X_s$, $y = Y_s$, $h = \varphi_s$, $k = \zeta_s$ and apply (c) of Lemma 2.7.

This proves the inequality in Theorem 2.2 under the assumption of Claim 2.3, Claim 2.4, Claim 2.5, Lemma 2.6 and Lemma 2.7. We will elaborate on these claims and lemmas in Section 3. In Section 4 we construct an example which shows that $\beta+2$ is the best constant.

3. Proof of claims and lemmas

Proof of Claim 2.3. Let Itô processes $X$ and $Y$ satisfy the assumptions of Theorem 2.2. For each $\varepsilon > 0$, the new processes $X + \varepsilon$ and $(Y, \varepsilon)$, where $(Y, \varepsilon)$ is $H \times \mathbb{R}$-valued, satisfy the extra assumption in Claim 2.3 as well as the assumptions in Theorem 2.2; indeed $X + \varepsilon > 0$ and $|(Y, \varepsilon)| = (|Y|^2 + \varepsilon^2)^{1/2} > 0$. Assuming the inequality (2.6) for these new processes with $\lambda = 1$ we have

(3.1) $P\big( (Y, \varepsilon)^* \ge 1 \big) \le (\beta+2)\|X + \varepsilon\|.$

Notice that $Y^* \le (Y, \varepsilon)^*$ and $\|X + \varepsilon\| = \|X\| + \varepsilon$. Thus, (3.1) yields as $\varepsilon \to 0$ the inequality $P(Y^* \ge 1) \le (\beta+2)\|X\|$, proving Claim 2.3. $\square$

Proof of Claim 2.4. Define a stopping time $\sigma$ by $\sigma = \inf\{t > 0 : |Y_t| > 1\}$. Since $|Y_0| \le |X_0|$ from (2.3) and $(\beta+2)\|X\| < 1$ (recall that $|X_0|$ is constant, thus $|X_0| = E|X_0| \le \|X\| < 1$), we have $|Y_0| < 1$; hence if $Y^* > 1$, then $\sigma < \infty$ and $|Y_\sigma| = 1$. Here we used the continuity of $Y$. Assuming the inequality (2.7) for $\sigma \wedge n$, we get

(3.2) $P(|Y_{\sigma \wedge n}| = 1) = P(|Y_{\sigma \wedge n}| \ge 1) \le (\beta+2)\|X\|.$


Also, using Fatou's lemma we have

(3.3) $P(Y^* > 1) \le P(|Y_\sigma| = 1 \text{ and } \sigma < \infty) \le \liminf_{n \to \infty} P(|Y_{\sigma \wedge n}| = 1).$

From (3.2) and (3.3) we have

$P(Y^* > 1) \le (\beta+2)\|X\|,$

from which we get $P(Y^* \ge 1) \le (\beta+2)\|X\|$: first consider $(1 + 1/n)X$ and $(1 + 1/n)Y$, and let $n \to \infty$. This proves Claim 2.4. $\square$

Proof of Claim 2.5. Let $\tau$ be a bounded stopping time. We define stopping times $\sigma$ and $\tau_n$ by $\sigma = \inf\{t > 0 : X_t + |Y_t| \ge 1\}$ and

$\tau_n = \inf\Big\{ t > 0 : \int_0^t |\varphi_s|^2\,d\langle M \rangle_s > n \Big\}.$

Here $\sigma$ is a stopping time because $X$ and $Y$ are continuous. From (2.3), the assumption that $(\beta+2)\|X\| < 1$, and the assumption that $X_0 \ge 0$ is constant, we have $X_0 + |Y_0| \le 2X_0 = 2E X_0 \le 2\|X\| < 1$. Thus, $X_t + |Y_t| \le 1$ if $0 \le t \le \sigma$. The assumption (2.1) implies $\tau_n \uparrow \infty$. Put $\mu_n = \tau \wedge \sigma \wedge \tau_n$. Observe that the stopping time $\mu_n$ satisfies all the conditions in Claim 2.5: here $E \int_0^{\mu_n} |\varphi_s|^2\,d\langle M \rangle_s \le n < \infty$. Assuming (2.9) for $\mu_n$, we have

$P(X_{\mu_n} + |Y_{\mu_n}| = 1) \le (\beta+2)E X_{\mu_n} \le (\beta+2)\|X\|.$

Observe that if $X_\tau + |Y_\tau| \ge 1$, then $\sigma \le \tau$ and $\sigma = \tau \wedge \sigma$. Thus, Fatou's lemma gives

$P(X_\tau + |Y_\tau| \ge 1) \le P(X_{\tau \wedge \sigma} + |Y_{\tau \wedge \sigma}| = 1) \le \liminf_{n \to \infty} P(X_{\mu_n} + |Y_{\mu_n}| = 1) \le (\beta+2)\|X\|,$

which yields (2.8) and completes the proof of Claim 2.5. $\square$


Proof of Lemma 2.6. Let $(x, y) \in S$.

Proof of (a). We may assume $x + |y| < 1$ because $U(x, y) = V(x, y)$ if $x + |y| = 1$. Write $x + |y| = r^{\beta+1}$. Since $0 < r < 1$, we have

$U(x, y) - V(x, y) = \big( |y| - (\beta+1)x \big)r + (\beta+2)x = |y|r + \big( \beta + 2 - (\beta+1)r \big)x > 0,$

because $\beta + 2 - (\beta+1)r > \beta + 2 - (\beta+1) = 1 > 0$.

Proof of (b). If $x \ge |y|$, then $|y| - (\beta+1)x \le |y| - x \le 0$, hence $U(x, y) \le 0$. $\square$

Proof of Lemma 2.7. Writing $\rho = (x + |y|)^{-\beta/(\beta+1)}$, a direct computation from (2.12) gives

$U_x(x, y) = -\frac{\beta+2}{\beta+1}\big( (\beta+1)x + \beta|y| \big)\rho, \qquad U_y(x, y) = \frac{\beta+2}{\beta+1}\,\rho\,y, \qquad |U_y(x, y)| = \frac{\beta+2}{\beta+1}\,\rho\,|y|.$

Hence

$U_x(x, y) + \beta|U_y(x, y)| = -\frac{\beta+2}{\beta+1}\rho\big( (\beta+1)x + \beta|y| - \beta|y| \big) = -(\beta+2)x\rho \le 0,$

which proves (a). Also, if $x + |y| \le 1$,

$|U_x(x, y)| + |U_y(x, y)| = \frac{\beta+2}{\beta+1}\rho\big( (\beta+1)x + \beta|y| + |y| \big) = (\beta+2)(x + |y|)^{1/(\beta+1)} \le \beta+2,$

which proves (b).

Proof of (c). Define $G$ on $I = \{ t \in \mathbb{R} : x + th > 0 \text{ and } |y + tk| > 0 \}$ by

$G(t) = U(x + th,\, y + tk).$

Observe that $I$ is an open set, $0 \in I$ and that $G(t)$ is smooth on $I$. By the chain rule one has

$G''(0) = U_{xx}(x, y)|h|^2 + 2U_{xy}(x, y) \cdot hk + U_{yy}(x, y)k \cdot k.$

Hence the proof is complete if we can check $G''(0) \le 0$. If no confusion arises, we will not write the argument $t \in I$. On $I$ define more functions $K$, $Q$ and $R$ by $K = K(t) = x + th$, $Q = |y + tk|$ and $R = K + Q$. Writing

$G = R^{(\beta+2)/(\beta+1)} - (\beta+2)KR^{1/(\beta+1)},$

we get

$G' = \frac{\beta+2}{\beta+1}R'R^{1/(\beta+1)} - (\beta+2)hR^{1/(\beta+1)} - \frac{\beta+2}{\beta+1}KR'R^{-\beta/(\beta+1)},$


and

$G'' = \frac{1}{\Theta}\Big( R''R^2 + \frac{1}{\beta+1}(R')^2R - 2hR'R - KR''R + \frac{\beta}{\beta+1}K(R')^2 \Big), \quad \text{where } \Theta = \frac{\beta+1}{\beta+2}R^{(2\beta+1)/(\beta+1)}.$

Rearranging terms and inserting $(R')^2R - R(R')^2$, we have

$\Theta\,G'' = \big( R''R - KR'' - 2hR' + (R')^2 \big)R + \Big( -R + \frac{1}{\beta+1}R + \frac{\beta}{\beta+1}K \Big)(R')^2 = \big( |k|^2 - |h|^2 \big)R - \frac{\beta}{\beta+1}Q(R')^2 \le 0.$

Here we used the observation that $K' = h$, $Q' = R' - h$, $QQ' = k \cdot (y + tk)$ and $QR'' = QQ'' = |k|^2 - (Q')^2$. Putting $t = 0$ we get $G''(0) \le 0$ and this proves (c) of Lemma 2.7. $\square$

4. About the Best Constant

Let $(\Omega, \mathcal{F}, P)$, $(\mathcal{F}_t)_{t \ge 0}$ and $B = (B_t)_{t \ge 0}$ be as in the introduction. Let $\beta \ge 0$ and $0 < \gamma < \beta + 2$.

We will need to consider sequences $(a_n)_{n \ge 1}$, $(b_n)_{n \ge 1}$ and $(c_n)_{n \ge 1}$. They satisfy $a_1 = b_1 = 1/2$, $c_1 = 0$, and if $n > 1$, then $a_n + b_n + c_n = 1$ and

(4.1) $a_n = \frac{n-1}{n}a_{n-1} + \frac{a_{n-1}}{(\beta+1)n(2n-1)} + \frac{b_{n-1} + c_{n-1}}{(\beta+1)n}.$

Proposition 4.1. $\lim_{n \to \infty} a_n = 1/(\beta+2)$.
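
Before turning to the proof, the limit is easy to check numerically. The sketch below is illustrative and not part of the paper; it iterates (4.1) using $b_{n-1} + c_{n-1} = 1 - a_{n-1}$. Convergence is slow (of order $\log n / n$, as the proof below suggests), so a large $N$ is used.

def a_n(beta, N=200000):
    # iterate (4.1) from a_1 = 1/2, with b_(n-1) + c_(n-1) = 1 - a_(n-1)
    a = 0.5
    for n in range(2, N + 1):
        a = ((n - 1) / n) * a \
            + a / ((beta + 1) * n * (2 * n - 1)) \
            + (1 - a) / ((beta + 1) * n)
    return a

for beta in (0.0, 1.0, 3.0):
    print(beta, a_n(beta), 1.0 / (beta + 2))   # a_N should approach 1/(beta+2)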


Proof. Noting $b_{n-1} + c_{n-1} = 1 - a_{n-1}$, we compute

$a_n = a_{n-1}\Big( 1 - \frac{(\beta+1)(2n-1) + 2(n-1)}{(\beta+1)n(2n-1)} \Big) + \frac{1}{(\beta+1)n}.$

With $t_n = a_n - 1/(\beta+2)$ one has

$t_n = t_{n-1}\Big( 1 - \frac{(\beta+1)(2n-1) + 2(n-1)}{(\beta+1)n(2n-1)} \Big) + \frac{1}{(\beta+1)(\beta+2)n(2n-1)}.$

Also, $0 \le t_1 = \frac{\beta}{2(\beta+2)}$, and for all $n > 1$

$0 \le 1 - \frac{(\beta+1)(2n-1) + 2(n-1)}{(\beta+1)n(2n-1)} \le \frac{n-1}{n}.$

Thus $t_n \ge 0$ for all $n$, and multiplying the recursion by $n$ gives

$n\,t_n \le (n-1)\,t_{n-1} + \frac{1}{(\beta+1)(\beta+2)(2n-1)},$

so that

$n\,t_n \le t_1 + \frac{1}{(\beta+1)(\beta+2)}\sum_{k=2}^{n}\frac{1}{2k-1} = O(\log n).$

Hence $t_n \to 0$, that is, $\lim_{n \to \infty} a_n = 1/(\beta+2)$. $\square$

Since $a_n \to 1/(\beta+2)$ and $\gamma < \beta+2$, we may fix a positive integer $N$ such that $\gamma a_N < 1$. Using exit times of the Brownian motion we now define stopping times $0 = \tau_0 \le \tau_1 \le \cdots \le \tau_{4N}$; in particular, for $1 < n \le N$,

$\tau_{4n-3} = \inf\{ s > \tau_{4n-4} : B_s - B_{\tau_{4n-4}} \notin (-2n+2,\, 1) \},$

$\tau_{4n-1} = \inf\Big\{ s > \tau_{4n-2} : B_s - B_{\tau_{4n-2}} \notin \Big( -\frac{2}{\beta+1},\; 2n - \frac{2}{\beta+1} \Big) \Big\},$

$\tau_{4n} = \inf\{ s > \tau_{4n-1} : B_s - B_{\tau_{4n-1}} \notin (-2n+1,\, 1) \}.$


Finally, put $\tau_{4N+1} = 1 + \tau_{4N}$.

Observe, from the strong Markov property of the Brownian motion and the basic facts of exit times of the Brownian motion, that $\tau_n$ is finite almost surely, and that

(4.4) $P\big( B_{\tau_{4n}} - B_{\tau_{4n-1}} = -2n+1 \big) = 1 - P\big( B_{\tau_{4n}} - B_{\tau_{4n-1}} = 1 \big) = \frac{1}{2n},$

$P\big( B_{\tau_{4n-3}} - B_{\tau_{4n-4}} = -2n+2 \big) = 1 - P\big( B_{\tau_{4n-3}} - B_{\tau_{4n-4}} = 1 \big) = \frac{1}{2n-1},$

$P\Big( B_{\tau_{4n-1}} - B_{\tau_{4n-2}} = -\frac{2}{\beta+1} \Big) = 1 - P\Big( B_{\tau_{4n-1}} - B_{\tau_{4n-2}} = 2n - \frac{2}{\beta+1} \Big) = \Big( 2n - \frac{2}{\beta+1} \Big)\frac{1}{2n}.$

We write $\operatorname{sgn} x = 1$ if $x > 0$ and $\operatorname{sgn} x = -1$ if $x < 0$. Also, write $(x, y)c$ for the scalar multiplication $c(x, y)$ and $1_A$ for the indicator random variable on the set $A$.

Itô processes $X$ and $Y$ are defined by $X_0 = Y_0 = 1$ and the formula (1.2). Here we define $\varphi$, $\psi$, $\zeta$ and $\xi$ piecewise, in (4.5), on the intervals $[\tau_{k-1}, \tau_k)$ determined by the stopping times above; the definition is arranged so that the following holds.
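
The identities (4.4) are instances of the classical exit (gambler's-ruin) probabilities of Brownian motion; for completeness, here is the standard computation (not in the original). If $a, b > 0$ and $\sigma$ is the exit time of $B$ from $(-a, b)$, then $(B_{t \wedge \sigma})$ is a bounded martingale, so optional sampling gives

$0 = E B_\sigma = -a\,P(B_\sigma = -a) + b\,P(B_\sigma = b), \qquad\text{hence}\qquad P(B_\sigma = -a) = \frac{b}{a+b}, \quad P(B_\sigma = b) = \frac{a}{a+b}.$

The first line of (4.4), for example, is the case $a = 2n-1$, $b = 1$ applied to the increments after time $\tau_{4n-1}$.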


Define $a_n$, $b_n$ and $c_n$ for $1 \le n \le N$ by

$a_n = P\big( (X_{\tau_{4n}}, Y_{\tau_{4n}}) = (2n, 0) \big),$

$b_n = P\big( (X_{\tau_{4n}}, Y_{\tau_{4n}}) = (0, 2n) \big),$

$c_n = P\big( (X_{\tau_{4n}}, Y_{\tau_{4n}}) = (0, -2n) \big).$

Then we have $a_1 = b_1 = 1/2$, $c_1 = 0$, and for $1 < n \le N$ the recursion (4.1) holds, so these are the sequences of Proposition 4.1.


During the final unit time interval $[\tau_{4N}, \tau_{4N+1}]$ the pair $(X, Y)$ is moved so that $(X_{\tau_{4N+1}}, Y_{\tau_{4N+1}})$ takes the values $(0, 2N)$, $(0, -2N)$, $(4N, 2N)$ and $(4N, -2N)$ with probability $b_N + a_N/4$, $c_N + a_N/4$, $a_N/4$ and $a_N/4$, respectively. Thus

$P(Y^* \ge 2N) \ge P\big( |Y_{\tau_{4N+1}}| = 2N \big) = 1.$

Since $X$ is stopped at $\tau_{4N+1}$ we have $X_t \to X_{\tau_{4N+1}}$ as $t \to \infty$. Also, $|\varphi| \le 1$ and $0 \le \psi \le 1$, hence $X$ is a submartingale; we also have $|X| \le 4N$. Thus, for any bounded stopping time $\tau$, Doob's optional sampling theorem gives

$E X_\tau \le E X_{\tau_{4N+1}} = 4N\,\frac{a_N}{4} + 4N\,\frac{a_N}{4} = 2N a_N.$

Hence $\|X\| \le 2N a_N$. Since $1 > \gamma a_N$, with $\lambda = 2N$ we have

$\lambda P(Y^* \ge \lambda) = 2N > \gamma \cdot 2N a_N \ge \gamma\|X\|.$

This proves that $\beta+2$ is the best constant.

References

[1] D. L. Burkholder, Sharp probability bounds for Itô processes, in: J. K. Ghosh, S. K. Mitra, K. R. Parthasarathy, and B. L. S. Prakasa Rao, eds., Statistics and Probability: A Raghu Raj Bahadur Festschrift, Wiley, New York, 1993, pp. 135-145.
[2] D. L. Burkholder, Strong differential subordination and stochastic integration, Ann. Probab. 22 (1994), 995-1025.
[3] N. Ikeda and S. Watanabe, Stochastic Differential Equations and Diffusion Processes, North-Holland, Amsterdam, 1981.
[4] S. Lang, Analysis I, Addison-Wesley, Reading, Mass., 1968.

Department of Mathematics
KAIST
Taejon 305-701, Korea
E-mail: cschoi@math.kaist.ac.kr
