... and that the exact constant depends on $f_\beta(0)$. For $0 < \beta \le 1$, the solution of (2.6) is known (see Korostelev (1993) or Donoho (1994a)), and the kernel used by Korostelev, defined in (2.5), is equal to that defined in (2.7) up to a renormalization on the support. However, the function $f_\beta$ is not known for $\beta > 1$, except for $\beta = 2$. Korostelev and Nussbaum (1999) have found the exact constant and an asymptotically exact estimator for the density model in sup-norm. Lepski (1992) has studied the exact constant in the case of adaptation for the white noise model. Sup-norm estimation is only one of the approaches studied in the nonparametric literature. For the $L_2$-norm risk, an overview of results on exact minimax and adaptive estimation can be found in the books of Efromovich (1999) and Tsybakov (2004).

Our results are the following. In Section 2.2, we give an asymptotically exact estimator $\theta_n^*$ and the exact constant for the regression model with random design. If the density $\mu$ is uniform ($\mu_0 = 1$), then the constant is equal to $w(C_0)$ (the constant of Korostelev (1993)). As could be expected, the exact constant and the asymptotically exact estimator $\theta_n^*$ depend on the minimum value $\mu_0$ of the design density. This means that the asymptotically minimax estimators contribute to the sup-norm risk essentially at the points where we have fewer observations. The estimator $\theta_n^*$ proposed in Section 2.2 is close to a Nadaraya-Watson estimator and is independent of $Q$. The proofs are given in Section 2.3.

2.2 The main result and the estimator

In this section, we define an estimator $\theta_n^*$. We shall prove in Subsection 2.3.1 that $\theta_n^*$ is an asymptotically exact estimator. This estimator is close to a Nadaraya-Watson estimator with the kernel $K$ defined in (2.5). The bandwidth of $\theta_n^*$ is
$$
h = \left( \frac{C_0' \, \psi_n}{L} \right)^{1/\beta},
\qquad \text{with} \qquad
C_0' = \left( \frac{\sigma^2 L^{1/\beta} (\beta + 1)}{2 \beta^2 \mu_0} \right)^{\frac{\beta}{2\beta+1}}.
$$
First, let us define $\theta_n^*$ on a regular grid of points $x_k = \frac{km}{n} \in [0,1]$ for $k \in \{1, \dots, [\tfrac{n}{m}]\}$, with $m = [\delta_n \, n \, \psi_n^{1/\beta} + 1]$, $\delta_n = \frac{1}{\log n}$, where $[x]$ denotes the integer part of $x$. To account for the boundary effects, we need to introduce two other kernels:
$$
K_1(t) = 2K(t)\,\mathbb{I}_{[0,1]}(t), \qquad K_2(t) = 2K(t)\,\mathbb{I}_{[-1,0]}(t), \qquad t \in \mathbb{R}.
$$
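To make the construction concrete, here is a minimal numerical sketch of a kernel estimator built along these lines. It is an illustration, not the thesis's estimator: the kernel $K$ of (2.5) is not reproduced in this excerpt, so an Epanechnikov kernel supported on $[-1,1]$ stands in for it; $\psi_n$ is taken to be the usual sup-norm rate $(\log n / n)^{\beta/(2\beta+1)}$, which is assumed rather than quoted from the text; the assignment of $K_1$ and $K_2$ to the left and right boundaries, the Nadaraya-Watson normalization, and all parameter values are illustrative assumptions.

```python
import numpy as np

def bandwidth(n, beta, L, sigma, mu_0):
    """h = (C'_0 psi_n / L)^(1/beta); psi_n assumed to be (log n / n)^(beta/(2beta+1))."""
    psi_n = (np.log(n) / n) ** (beta / (2 * beta + 1))
    C0_prime = (sigma**2 * L ** (1 / beta) * (beta + 1)
                / (2 * beta**2 * mu_0)) ** (beta / (2 * beta + 1))
    return (C0_prime * psi_n / L) ** (1 / beta), psi_n

def K(t):
    # Stand-in for the kernel of (2.5): Epanechnikov kernel supported on [-1, 1].
    return np.where(np.abs(t) <= 1.0, 0.75 * (1.0 - t**2), 0.0)

def K1(t):
    # Boundary kernel supported on [0, 1] (used here near x = 0, an assumption).
    return 2.0 * K(t) * (t >= 0) * (t <= 1)

def K2(t):
    # Boundary kernel supported on [-1, 0] (used here near x = 1, an assumption).
    return 2.0 * K(t) * (t >= -1) * (t <= 0)

def theta_star(x_grid, X, Y, h):
    """Nadaraya-Watson-type estimate on the grid, with boundary-corrected kernels."""
    est = np.empty_like(x_grid)
    for j, x in enumerate(x_grid):
        if x < h:            # left boundary: weight only observations to the right of x
            w = K1((X - x) / h)
        elif x > 1 - h:      # right boundary: weight only observations to the left of x
            w = K2((X - x) / h)
        else:                # interior point: usual kernel weights
            w = K((X - x) / h)
        est[j] = np.sum(w * Y) / max(np.sum(w), 1e-12)
    return est

# Grid x_k = k m / n, k = 1, ..., [n/m], with m = [delta_n n psi_n^(1/beta) + 1].
n, beta, L, sigma, mu_0 = 2000, 1.0, 1.0, 0.5, 1.0   # illustrative values
h, psi_n = bandwidth(n, beta, L, sigma, mu_0)
delta_n = 1.0 / np.log(n)
m = int(delta_n * n * psi_n ** (1 / beta) + 1)
x_grid = np.arange(1, n // m + 1) * m / n

# Simulated data from a random uniform design (mu_0 = 1) with a toy regression function.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=n)
Y = np.sin(2 * np.pi * X) + sigma * rng.normal(size=n)
fit = theta_star(x_grid, X, Y, h)
```

The point of the sketch is only to show how the pieces fit together: the bandwidth is driven by the constant $C_0'$ and the rate $\psi_n$, the estimator is evaluated on the coarse grid $x_k = km/n$, and the one-sided kernels $K_1$, $K_2$ replace $K$ within a bandwidth of the endpoints so that no weight is placed outside $[0,1]$.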
