
The np Package - NexTag Supports Open Source Initiatives


npindexbw

method           the single index model method, one of either "ichimura"
                 (default) (Ichimura (1993)) or "kleinspady" (Klein and
                 Spady (1993)). Defaults to ichimura.

nmulti           integer number of times to restart the process of finding
                 extrema of the cross-validation function from different
                 (random) initial points. Defaults to min(5, ncol(xdat)).

random.seed      an integer used to seed R's random number generator. This
                 ensures replicability of the numerical search. Defaults
                 to 42.

bandwidth.compute
                 a logical value which specifies whether to do a numerical
                 search for bandwidths or not. If set to FALSE, a bandwidth
                 object will be returned with bandwidths set to those
                 specified in bws. Defaults to TRUE.

optim.method     method used by optim for minimization of the objective
                 function. See ?optim for references. Defaults to
                 "Nelder-Mead".

                 The default method is an implementation of that of Nelder
                 and Mead (1965), which uses only function values and is
                 robust but relatively slow. It will work reasonably well
                 for non-differentiable functions.

                 Method "BFGS" is a quasi-Newton method (also known as a
                 variable metric algorithm), specifically that published
                 simultaneously in 1970 by Broyden, Fletcher, Goldfarb and
                 Shanno. This uses function values and gradients to build
                 up a picture of the surface to be optimized.

                 Method "CG" is a conjugate gradients method based on that
                 by Fletcher and Reeves (1964) (but with the option of
                 Polak-Ribiere or Beale-Sorenson updates). Conjugate
                 gradient methods will generally be more fragile than the
                 BFGS method, but as they do not store a matrix they may
                 be successful in much larger optimization problems.

optim.maxattempts
                 maximum number of attempts taken trying to achieve
                 successful convergence in optim. Defaults to 100.

optim.abstol     the absolute convergence tolerance used by optim. Only
                 useful for non-negative functions, as a tolerance for
                 reaching zero. Defaults to .Machine$double.eps.

optim.reltol     relative convergence tolerance used by optim. The
                 algorithm stops if it is unable to reduce the value by a
                 factor of reltol * (abs(val) + reltol) at a step. Defaults
                 to sqrt(.Machine$double.eps), typically about 1e-8.

optim.maxit      maximum number of iterations used by optim. Defaults
                 to 500.

...              additional arguments supplied to specify the parameters
                 to the sibandwidth S3 method, which is called during the
                 numerical search.

Details

We implement Ichimura's (1993) method via joint estimation of the bandwidth and coefficient vector using leave-one-out nonlinear least squares. We implement Klein and Spady's (1993) method maximizing the leave-one-out log likelihood function jointly with respect to the bandwidth and coefficient vector. Note that Klein and Spady's (1993) method is for binary outcomes only, while Ichimura's (1993) method can be applied for any outcome datatype (i.e., continuous or discrete). We impose the identification condition that the first element of the coefficient vector beta is equal to one, while identification also requires that the explanatory variables contain at least one continuous variable.
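As a sketch of how these arguments fit together (the data and variable names here are illustrative, not from the manual; the np package must be installed):

```r
# Illustrative sketch: single index bandwidth selection with npindexbw().
library(np)

set.seed(42)
n  <- 250
x1 <- rnorm(n)                  # identification requires at least one
x2 <- rnorm(n)                  # continuous explanatory variable
y  <- x1 + 0.5 * x2 + rnorm(n)  # continuous outcome: Ichimura's method applies

# Ichimura (1993): joint leave-one-out nonlinear least squares over the
# bandwidth and coefficient vector, with beta[1] normalized to one.
bw <- npindexbw(formula = y ~ x1 + x2,
                method = "ichimura",
                nmulti = 2,                   # restarts of the numerical search
                optim.method = "Nelder-Mead")
summary(bw)

# Klein and Spady (1993) is for binary outcomes only:
yb  <- as.numeric(x1 + 0.5 * x2 + rnorm(n) > 0)
bw2 <- npindexbw(formula = yb ~ x1 + x2, method = "kleinspady")
```

Because the objective is minimized from nmulti random starting points, fixing random.seed (or calling set.seed beforehand) is what makes the selected bandwidths reproducible across runs.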
