
Optimality


Massive multiple hypotheses testing

[Figure 1 appears here: four panels (a)-(d), each with axes on $[0,1]$; only the caption is recoverable.]

Fig 1. (a) The interior knot positions, indicated by |, and the $P$ value EDF; (b) the positions of $t^*_j$, indicated by |, and the $P$ value EQF; (c) $\hat q_m$: the derivative of $\hat Q_m$; (d) the $P$ value EQF (solid), the smoothed EQF $\hat Q_m$ from Algorithm 1 (dash-dot), and the convex backbone $\hat Q^*_m$ (long dash).

a measurement that is an $L_2$ distance. The concept of convex backbone facilitates the derivation of a measurement more adaptive to the ensemble distribution of the $P$ values. Given the convex backbone $Q^*_m(\cdot) := Q^*_m(\cdot;\gamma, a, d, b_1, b_0, \tau_m)$ as defined in (3.2), the "model-fitting" term can be defined as the $L_\gamma$ distance between $Q^*_m(\cdot)$ and uniformity on $[0,\alpha]$:

$$D_\gamma(\alpha) := \left[\int_0^\alpha \bigl(t - Q^*_m(t)\bigr)^\gamma\,dt\right]^{1/\gamma}, \qquad \alpha \in (0,1].$$
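As a minimal numerical sketch of this definition: the integral can be approximated by the trapezoidal rule once a convex backbone is fixed. Here $Q^*(t)=t^2$ is an illustrative stand-in (it is convex with $Q(0)=0$, $Q(1)=1$) and is *not* the paper's $Q^*_m(\cdot;\gamma,a,d,b_1,b_0,\tau_m)$ from (3.2); the function name `D_gamma` is likewise our own.

```python
def D_gamma(alpha, gamma, Q, n=100_000):
    """Trapezoidal approximation of
    D_gamma(alpha) = ( int_0^alpha (t - Q(t))^gamma dt )^(1/gamma)."""
    h = alpha / n
    vals = [(i * h - Q(i * h)) ** gamma for i in range(n + 1)]
    return (h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))) ** (1.0 / gamma)

# Illustrative convex backbone (an assumption, not the paper's Q*_m):
Q_star = lambda t: t ** 2

# For Q*(t) = t^2 and gamma = 2, int_0^1 (t - t^2)^2 dt = 1/30,
# so D_2(1) = sqrt(1/30) ~= 0.1826.
print(D_gamma(1.0, 2.0, Q_star))
```

For a smooth integrand like $t - Q^*(t)$, the trapezoidal error at this grid size is negligible relative to the quantities compared in the text.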

The adaptivity is reflected by the use of the $L_\gamma$ distance: recall that the larger the $\gamma$, the higher the concentration of small $P$ values, and the norm inequality (Hardy, Littlewood, and Pólya [16], p. 157) implies that $D_{\gamma_2}(\alpha) \ge D_{\gamma_1}(\alpha)$ for every $\alpha \in (0,1]$ if $\gamma_2 > \gamma_1$.
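This monotonicity in $\gamma$ can be checked numerically. A small sketch, again assuming the illustrative convex backbone $Q^*(t)=t^2$ rather than the paper's $Q^*_m$ (the helper `D` is ours):

```python
def D(alpha, gamma, Q, n=100_000):
    # Trapezoidal approximation of ( int_0^alpha (t - Q(t))^gamma dt )^(1/gamma).
    h = alpha / n
    vals = [(i * h - Q(i * h)) ** gamma for i in range(n + 1)]
    return (h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))) ** (1.0 / gamma)

Q_star = lambda t: t ** 2  # illustrative convex backbone (assumption)

# D_gamma(alpha) grows with gamma, as the norm inequality predicts
# for a measure of total mass alpha <= 1:
for alpha in (0.25, 0.5, 1.0):
    d1, d2, d3 = (D(alpha, g, Q_star) for g in (1.0, 2.0, 3.0))
    assert d1 <= d2 <= d3
```

The inequality holds here because the interval $[0,\alpha]$ has total mass at most 1, which is exactly the setting the cited norm inequality covers.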

Clearly $D_\gamma(\alpha)$ is non-decreasing in $\alpha$. Intuitively, one possibility would be to maximize a criterion like $D_\gamma(\alpha) - \lambda\pi_0 m\alpha$. However, the two terms are not on the
