Processus de Lévy en Finance - Laboratoire de Probabilités et ...


CHAPTER 2. THE CALIBRATION PROBLEM

and 2.10 and Prohorov's theorem, {Q_n}_{n≥1} is weakly relatively compact, which proves the first part of the theorem.

Choose any subsequence of {Q_n}_{n≥1} converging weakly to a process Q* ∈ M ∩ L⁺_B. To simplify notation, this subsequence is again denoted by {Q_n}_{n≥1}. The triangle inequality and Lemma 2.2 imply that

$$\|C^{Q_n} - C_M^n\|^2 \xrightarrow[n\to\infty]{} \|C^{Q^*} - C_M\|^2. \tag{2.33}$$
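A sketch of why (2.33) holds, assuming (as the appeal to Lemma 2.2 suggests) that weak convergence of Q_n to Q* gives ‖C^{Q_n} − C^{Q*}‖ → 0 and that the data satisfy ‖C_M^n − C_M‖ → 0: the reverse triangle inequality, applied twice, yields

```latex
% Two applications of the (reverse) triangle inequality:
\Bigl|\,\|C^{Q_n} - C_M^n\| - \|C^{Q^*} - C_M\|\,\Bigr|
  \;\le\; \|C^{Q_n} - C^{Q^*}\| + \|C_M^n - C_M\|
  \;\xrightarrow[n\to\infty]{}\; 0,
```

and convergence of the norms implies convergence of their squares.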

Since, by Lemma 2.11, the relative entropy functional is weakly lower semicontinuous in Q, for every Q ∈ M ∩ L⁺_B,

‖C Q∗ − C M ‖ + αI(Q|P ) ≤ lim inf{‖C Qn − C n<br />

n<br />

M‖ 2 + αI(Q n |P )}<br />

≤ lim inf{‖C Q − C n<br />

n<br />

M‖ 2 + αI(Q|P )}<br />

= lim<br />

n<br />

‖C Q − C n M‖ 2 + αI(Q|P )<br />

= ‖C Q − C M ‖ 2 + αI(Q|P ),<br />

where the second inequality follows from the fact that Q_n is the solution of the calibration problem with data C_M^n, and the last line follows from the triangle inequality.

2.5.2 Convergence of regularized solutions

In this section we study the convergence of solutions of the regularized calibration problem (2.27) to the solutions of the minimum entropy least squares calibration problem (2.11) as the noise level in the data tends to zero.

Theorem 2.17. Let {C_M^δ} be a family of data sets of option prices such that ‖C_M − C_M^δ‖ ≤ δ, let P ∈ L_NA ∩ L⁺_B, and suppose that there exists a solution Q of problem (2.4) with data C_M (a least squares solution) such that I(Q|P) < ∞.

If ‖C^Q − C_M‖ = 0 (the constraints are reproduced exactly), let α(δ) be such that α(δ) → 0 and δ²/α(δ) → 0 as δ → 0. Otherwise, let α(δ) be such that α(δ) → 0 and δ/α(δ) → 0 as δ → 0.

Then every sequence {Q_{δ_k}}, where δ_k → 0 and Q_{δ_k} is a solution of problem (2.27) with data C_M^{δ_k}, prior P and regularization parameter α(δ_k), has a weakly convergent subsequence. The limit of every convergent subsequence is a solution of problem (2.11) (MELSS) with data C_M and prior P. If such a MELSS Q⁺ is unique, then lim_{δ→0} Q_δ = Q⁺.
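As a concrete illustration of the two regularization schedules admitted by Theorem 2.17 (these specific power-law choices are not part of the original statement, only simple instances satisfying its hypotheses):

```latex
% Exact-constraint case (\|C^Q - C_M\| = 0): take \alpha(\delta) = \delta, so that
\alpha(\delta) = \delta \to 0
\quad\text{and}\quad
\frac{\delta^2}{\alpha(\delta)} = \delta \to 0
\qquad\text{as } \delta \to 0.

% General case: take \alpha(\delta) = \sqrt{\delta}, so that
\alpha(\delta) = \sqrt{\delta} \to 0
\quad\text{and}\quad
\frac{\delta}{\alpha(\delta)} = \sqrt{\delta} \to 0
\qquad\text{as } \delta \to 0.
```

In both regimes the regularization weight must vanish, but slowly enough relative to the noise level that the data-misfit term still dominates the entropy penalty in the limit.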
