Processus de Lévy en Finance - Laboratoire de Probabilités et ...
CHAPTER 2. THE CALIBRATION PROBLEM
are within an error $\delta$ of $C_M$, and want to construct an approximation to $\mathrm{MELSS}(C_M)$, the solution of problem (2.11) with the true data, it is not a good idea to solve problem (2.11) with the noisy data $C_M^\delta$, because $\mathrm{MELSS}(C_M^\delta)$ may be very far from $\mathrm{MELSS}(C_M)$. We therefore need to regularize problem (2.11), that is, construct a family of continuous "regularization operators" $\{R_\alpha\}_{\alpha>0}$, where $\alpha$ is the parameter which determines the intensity of regularization, such that $R_\alpha(C_M^\delta)$ converges to the MELSS of the calibration problem as the noise level $\delta$ tends to zero, provided that, for each $\delta$, the regularization parameter $\alpha$ is chosen appropriately. The approximation to $\mathrm{MELSS}(C_M)$ using the noisy data $C_M^\delta$ is then given by $R_\alpha(C_M^\delta)$ with an appropriate choice of $\alpha$.
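The stabilizing role of such a family $\{R_\alpha\}$ can be illustrated on a toy problem that is much simpler than the option-calibration setting: an ill-conditioned linear system regularized by a Tikhonov-type operator. All names below are hypothetical, a minimal sketch of the mechanism only; the convergence $R_\alpha(y^\delta) \to x$ as $\delta \to 0$ with $\alpha = \alpha(\delta) \to 0$ is the same phenomenon described above.

```python
import numpy as np

# Hypothetical toy problem: A x = y with A ill-conditioned, so the naive
# inverse amplifies noise.  The Tikhonov family
#   R_alpha(y) = (A^T A + alpha I)^{-1} A^T y
# is a classical example of a continuous regularization operator.
A = np.diag([1.0, 1e-4])           # second direction is nearly unobservable
x_true = np.array([1.0, 1.0])
y = A @ x_true                      # exact data

def R_alpha(y_obs, alpha):
    """Regularization operator: Tikhonov-regularized least squares."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y_obs)

errs = []
for delta in (1e-2, 1e-6, 1e-12):
    y_delta = y + delta * np.array([1.0, -1.0])  # noise at level delta
    alpha = delta                                # a priori choice alpha(delta) -> 0
    x_alpha = R_alpha(y_delta, alpha)
    errs.append(np.linalg.norm(x_alpha - x_true))
# errs decreases as delta -> 0, whereas the naive solution
# np.linalg.solve(A, y_delta) makes an error of order delta / 1e-4.
```

The choice $\alpha = \delta$ is one admissible a priori parameter rule; the point is only that $\alpha$ must be tied to the noise level for the reconstruction error to vanish.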
Following classical results on regularization of ill-posed problems (see [40]), we propose to construct a regularized version of (2.11) by using the relative entropy for penalization rather than for selection, that is, to define
$$J_\alpha(Q) = \|C_M^\delta - C^Q\|_w^2 + \alpha I(Q|P), \qquad (2.26)$$
where $\alpha$ is the regularization parameter, and solve the following regularized calibration problem:
Regularized calibration problem. Given prices $C_M$ of call options, a prior Lévy process $P$ and a regularization parameter $\alpha > 0$, find $Q^* \in M \cap L$ such that
$$J_\alpha(Q^*) = \inf_{Q \in M \cap L} J_\alpha(Q). \qquad (2.27)$$
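A finite-dimensional caricature of (2.26)–(2.27) may help fix ideas. In the sketch below the measure $Q$ is reduced to a weight vector $q$ on a fixed grid and the pricing map is assumed linear, $C^Q = Aq$ — both are illustrative simplifications introduced here, not the actual Lévy-process setting — and $J_\alpha$ is minimized over the probability simplex for a small and a large value of $\alpha$.

```python
import numpy as np
from scipy.optimize import minimize

# Toy discretization (hypothetical): q = weights on a fixed grid standing
# in for Q, prior weights p standing in for P, and a linear pricing map A.
rng = np.random.default_rng(1)
n_opts, n_grid = 5, 8
A = rng.uniform(0.5, 1.5, size=(n_opts, n_grid))   # hypothetical pricing map
p = np.full(n_grid, 1.0 / n_grid)                  # prior weights (role of P)
q_true = rng.dirichlet(np.ones(n_grid))
c_delta = A @ q_true + 1e-3 * rng.standard_normal(n_opts)  # noisy prices C_M^delta

def misfit(q):
    return np.sum((A @ q - c_delta) ** 2)   # ||C_M^delta - C^Q||_w^2 with w = 1

def entropy(q):
    return np.sum(q * np.log(q / p))        # relative entropy I(Q|P)

def J(q, alpha):
    return misfit(q) + alpha * entropy(q)   # the functional (2.26)

def solve(alpha):
    cons = [{"type": "eq", "fun": lambda q: q.sum() - 1.0}]  # probability weights
    bnds = [(1e-9, 1.0)] * n_grid
    return minimize(J, p, args=(alpha,), method="SLSQP",
                    bounds=bnds, constraints=cons).x

q_small = solve(1e-6)   # weak regularization: close fit to the prices
q_large = solve(10.0)   # strong regularization: solution stays near the prior
```

As $\alpha$ grows, the minimizer is pulled toward the prior (its relative entropy decreases) at the cost of a larger pricing error; as $\alpha \to 0$, it approaches a least-squares fit of the noisy prices — the trade-off that the regularized problem is designed to control.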
Problem (2.27) can be thought of in two ways:
• If the minimum entropy least squares solution with the true data $C_M$ exists, (2.27) makes it possible to construct a stable approximation of this solution using the noisy data.
• If the MELSS with the true data does not exist, either because the set of least squares solutions is empty or because the least squares solutions are incompatible with the prior, the regularized problem (2.27) makes it possible to find a "compromise solution", achieving a trade-off between the pricing constraints and the prior information.
In the rest of this section we study the regularized calibration problem. Under our standing hypothesis that the prior Lévy process has jumps bounded from above and corresponds to