npudensbw

remin     a logical value which, when set to TRUE, restarts the search routine from
          located minima for a minor gain in accuracy. Defaults to TRUE.

itmax     integer number of iterations before failure in the numerical optimization
          routine. Defaults to 10000.

ftol      tolerance on the value of the cross-validation function evaluated at
          located minima. Defaults to 1.19e-07 (FLT_EPSILON).

tol       tolerance on the position of located minima of the cross-validation
          function. Defaults to 1.49e-08 (sqrt(DBL_EPSILON)).

small     a small number, at about the precision of the data type used. Defaults to
          2.22e-16 (DBL_EPSILON).

Details

npudensbw implements a variety of methods for choosing bandwidths for multivariate (p-variate) distributions defined over a set of possibly continuous and/or discrete (unordered, ordered) data. The approach is based on Li and Racine (2003), who employ 'generalized product kernels' that admit a mix of continuous and discrete data types.

The cross-validation methods employ multivariate numerical search algorithms (direction set (Powell's) methods in multidimensions). Bandwidths can (and will) differ for each variable, which is, of course, desirable.

Three classes of kernel estimators for the continuous data types are available: fixed, adaptive nearest-neighbor, and generalized nearest-neighbor. Adaptive nearest-neighbor bandwidths change with each sample realization in the set, x_i, when estimating the density at the point x. Generalized nearest-neighbor bandwidths change with the point at which the density is estimated, x. Fixed bandwidths are constant over the support of x.

npudensbw may be invoked either with a formula-like symbolic description of variables on which bandwidth selection is to be performed or through a simpler interface whereby data is passed directly to the function via the dat parameter.
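As a sketch of the two interfaces just described (the variable names logwage and age are taken from the cps71 data set shipped with np; the exact calls are illustrative, not exhaustive):

```r
library("np")
data("cps71")  # example data shipped with the np package

## Formula interface: bandwidths for the joint density of logwage and age
bw <- npudensbw(formula = ~ logwage + age, data = cps71)

## Simpler interface: pass the data frame directly via the dat parameter
bw <- npudensbw(dat = cps71)

summary(bw)
```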
Use of these two interfaces is mutually exclusive.

Data contained in the data frame dat may be a mix of continuous (default), unordered discrete (to be specified in the data frame dat using factor), and ordered discrete (to be specified in the data frame dat using ordered). Data can be entered in an arbitrary order and data types will be detected automatically by the routine (see np for details).

Data for which bandwidths are to be estimated may be specified symbolically. A typical description has the form ~ data, where data is a series of variables specified by name, separated by the separation character '+'. For example, ~ x + y specifies that the bandwidths for the joint distribution of variables x and y are to be estimated. See below for further examples.

A variety of kernels may be specified by the user. Kernels implemented for continuous data types include the second, fourth, sixth, and eighth order Gaussian and Epanechnikov kernels, and the uniform kernel. Unordered discrete data types use a variation on Aitchison and Aitken's (1976) kernel, while ordered data types use a variation of the Wang and van Ryzin (1981) kernel.

Value

npudensbw returns a bandwidth object with the following components:

bw        bandwidth(s), scale factor(s) or nearest neighbours for the data, dat
fval      objective function value at minimum

If bwtype is set to fixed, an object containing bandwidths, of class bandwidth (or scale factors if bwscaling = TRUE), is returned. If it is set to generalized_nn or adaptive_nn, then instead the kth nearest neighbors are returned for the continuous variables while the discrete kernel bandwidths are returned for the discrete variables. Bandwidths are stored under the component name bw, with each element i corresponding to column i of input data dat.

The functions summary and plot support objects of type bandwidth.

Usage Issues

If you are using data of mixed types, then it is advisable to use the data.frame function to construct your input data and not cbind, since cbind will typically not work as intended on mixed data types and will coerce the data to the same type.

Caution: multivariate data-driven bandwidth selection methods are, by their nature, computationally intensive. Virtually all methods require dropping the ith observation from the data set, computing an object, repeating this for all observations in the sample, then averaging each of these leave-one-out estimates for a given value of the bandwidth vector, and only then repeating this a large number of times in order to conduct multivariate numerical minimization/maximization. Furthermore, due to the potential for local minima/maxima, restarting this procedure a large number of times may often be necessary. This can be frustrating for users possessing large datasets. For exploratory purposes, you may wish to override the default search tolerances, say, by setting ftol = 0.01 and tol = 0.01, and conduct multistarting (the default is to restart min(5, ncol(dat)) times), as is done for a number of examples.
Once the procedure terminates, you can restart the search with the default tolerances using the bandwidths obtained from the less rigorous search (i.e., set bws = bw on subsequent calls to this routine, where bw is the initial bandwidth object). A version of this package using the Rmpi wrapper is under development that allows one to deploy this software in a clustered computing environment to facilitate computation involving large datasets.

Author(s)

Tristen Hayfield <hayfield@phys.ethz.ch>, Jeffrey S. Racine <racinej@mcmaster.ca>

References

Aitchison, J. and C.G.G. Aitken (1976), "Multivariate binary discrimination by the kernel method," Biometrika, 63, 413-420.

Li, Q. and J.S. Racine (2007), Nonparametric Econometrics: Theory and Practice, Princeton University Press.

Li, Q. and J.S. Racine (2003), "Nonparametric estimation of distributions with categorical and continuous data," Journal of Multivariate Analysis, 86, 266-292.

Ouyang, D. and Q. Li and J.S. Racine (2006), "Cross-validation and the estimation of probability distributions with categorical data," Journal of Nonparametric Statistics, 18, 69-100.

Pagan, A. and A. Ullah (1999), Nonparametric Econometrics, Cambridge University Press.

Scott, D.W. (1992), Multivariate Density Estimation: Theory, Practice and Visualization, New York: Wiley.

Silverman, B.W. (1986), Density Estimation, London: Chapman and Hall.
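Following the Usage Issues advice above, an exploratory two-stage search might look like the sketch below (hedged: it reuses the cps71 data shipped with np, and the argument names ftol, tol, and bws follow this help page):

```r
library("np")
data("cps71")

## Stage 1: quick exploratory search with loose tolerances
bw.rough <- npudensbw(~ logwage + age, data = cps71,
                      ftol = 0.01, tol = 0.01)

## Stage 2: restart from the rough bandwidths with the default
## (tight) tolerances, as described under Usage Issues
bw <- npudensbw(bws = bw.rough)

summary(bw)
```

The second call begins the numerical search at the bandwidths located by the cheap first pass, so the tight-tolerance optimization typically converges in far fewer iterations than a cold start.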