
1. Introduction - Econometrics at Illinois - University of Illinois at ...


Roger Koenker and Zhijie Xiao

The foregoing results provide some basic machinery for a broad class of tests based on the quantile regression process. In the next section we provide further details on the implementation of these tests, focusing most of our attention on tests of the location shift and location-scale shift models.

5. Implementation of the Tests

Given a general framework for inference based on the quantile regression process, we can now elaborate some missing details. We will begin by considering tests of the location-scale shift hypothesis against a general quantile regression alternative. Tests of the location shift hypothesis and several variants of tests for heteroscedasticity will then be considered. Problems associated with estimation of nuisance parameters are treated in the final subsection.

5.1. The location-scale shift hypothesis. We would like to test

    $F_{y_i|x_i}^{-1}(\tau|x_i) = x_i^\top \alpha + x_i^\top \gamma\, F_0^{-1}(\tau)$

against the sequence of linear quantile regression alternatives

    $F_{y_i|x_i}^{-1}(\tau|x_i) = x_i^\top \beta_n(\tau).$

In the simplest case the univariate quantile function $F_0^{-1}$ is known and we can formulate the hypothesis in the (4.4) notation,

    $R\beta(\tau) - r = \zeta(\tau),$

by setting $r_i = \alpha_i/\gamma_i$, $R = \mathrm{diag}(\gamma_i^{-1})$, and $\zeta(\tau) = 1_p\, F_0^{-1}(\tau)$. Obviously, there is some difficulty if there are $\gamma_i$ equal to zero. In such cases, we can take $\gamma_i = 1$, and set the corresponding elements $r_i = \alpha_i$ and $\zeta_i(\tau) \equiv 0$. How should we go about estimating the parameters $\alpha$ and $\gamma$?
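As an illustrative sketch (not from the paper), the restriction $R\beta(\tau) - r = \zeta(\tau)$ can be assembled mechanically from given $(\alpha, \gamma)$, including the special handling of coordinates with $\gamma_i = 0$. The function name and interface here are our own invention, assuming numpy:

```python
import numpy as np

def location_scale_restriction(alpha, gamma, F0_inv):
    """Build R, r, and zeta(.) for the restriction R beta(tau) - r = zeta(tau)
    under the location-scale null beta_i(tau) = alpha_i + gamma_i F0^{-1}(tau).

    Coordinates with gamma_i == 0 are handled as in the text: set gamma_i = 1,
    r_i = alpha_i, and zeta_i(tau) identically zero.
    """
    alpha = np.asarray(alpha, dtype=float)
    gamma = np.asarray(gamma, dtype=float)
    zero = gamma == 0.0
    g = np.where(zero, 1.0, gamma)           # replace gamma_i = 0 by 1
    R = np.diag(1.0 / g)                     # R = diag(gamma_i^{-1})
    r = alpha / g                            # r_i = alpha_i / gamma_i (or alpha_i)

    def zeta(tau):
        # zeta(tau) = 1_p F0^{-1}(tau), zeroed where gamma_i = 0
        z = np.full(alpha.shape, F0_inv(tau))
        z[zero] = 0.0
        return z

    return R, r, zeta
```

With these definitions, $R\beta(\tau) - r = \zeta(\tau)$ holds exactly for any $\tau$ whenever $\beta(\tau)$ is generated by the null model, which is an easy sanity check on the construction.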
Under the null hypothesis,

    $\beta_i(\tau) = \alpha_i + \gamma_i F_0^{-1}(\tau), \qquad i = 1, \ldots, p,$

so it is natural to consider linear regression. Since $\hat\beta_i(\tau)$ is piecewise constant with jumps at the points $\{\tau_1, \ldots, \tau_J\}$, it suffices to consider $p$ bivariate linear regressions of $\hat\beta_i(\tau_j)$ on $\{(1, F_0^{-1}(\tau_j)) : j = 1, \ldots, J\}$. Each of these regressions has a known (asymptotic) Gaussian covariance structure that could be used to construct a weighted least squares estimator, but pragmatism might lead us to opt for the simpler unweighted estimator. In either case we have our required $O(n^{-1/2})$ estimators $\hat\alpha_n$ and $\hat\gamma_n$.

When $F_0^{-1}$ is (hypothetically) known, the Khmaladzation process is relatively painless computationally. The function $\dot g(t) = (1,\ \dot f_0(t),\ \dot f_0(t) F_0^{-1}(t))^\top$ is known and the transformation (3.3) can be carried out by recursive least squares. Again, the discretization is based on the jumps $\{\tau_1, \ldots, \tau_J\}$ of the piecewise constant $\hat\beta(\cdot)$ process.
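The unweighted version of the estimator above is just ordinary least squares applied column by column. A minimal sketch, assuming numpy and our own function name (the paper does not prescribe an implementation): given the quantile regression coefficients $\hat\beta_i(\tau_j)$ stacked in a $J \times p$ array, each column is regressed on $(1, F_0^{-1}(\tau_j))$, yielding $\hat\alpha_n$ and $\hat\gamma_n$ in one call.

```python
import numpy as np

def fit_location_scale(beta_hat, taus, F0_inv):
    """Estimate (alpha, gamma) under the null beta_i(tau) = alpha_i + gamma_i F0^{-1}(tau)
    by p unweighted bivariate OLS regressions of beta_hat[:, i] on (1, F0^{-1}(tau_j)).

    beta_hat : (J, p) array of quantile regression coefficients at the jump points
    taus     : (J,)  array of jump points tau_1, ..., tau_J
    F0_inv   : the (known) univariate quantile function F0^{-1}
    """
    z = F0_inv(np.asarray(taus, dtype=float))
    X = np.column_stack([np.ones_like(z), z])    # design rows (1, F0^{-1}(tau_j))
    # lstsq solves all p regressions at once; row 0 = intercepts, row 1 = slopes
    coef, *_ = np.linalg.lstsq(X, np.asarray(beta_hat, dtype=float), rcond=None)
    alpha_hat, gamma_hat = coef[0], coef[1]
    return alpha_hat, gamma_hat
```

When $\hat\beta(\cdot)$ is exactly generated by the null model this recovers $(\alpha, \gamma)$ exactly; in practice the fitted values differ from $\hat\beta_i(\tau_j)$ by the sampling noise that the weighted variant would downweight.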
