npksum

npksum enables the user to construct, for example, new kernel estimators. The convolution kernel option would allow you to create, say, the least-squares cross-validation function for kernel density estimation. npksum uses highly optimized C code that strives to minimize its 'memory footprint', while there is low overhead involved when using repeated calls to this function (see, by way of illustration, the example below that conducts leave-one-out cross-validation for a local constant regression estimator via calls to the R function nlm, and compares this to the npregbw function).

npksum implements a variety of methods for computing multivariate (p-variate) kernel sums defined over a set of possibly continuous and/or discrete (unordered, ordered) data. The approach is based on Li and Racine (2003), who employ 'generalized product kernels' that admit a mix of continuous and discrete data types.

Three classes of kernel estimators for the continuous data types are available: fixed, adaptive nearest-neighbor, and generalized nearest-neighbor. Adaptive nearest-neighbor bandwidths change with each sample realization in the set, x_i, when estimating the kernel sum at the point x. Generalized nearest-neighbor bandwidths change with the point at which the sum is computed, x. Fixed bandwidths are constant over the support of x.

npksum computes ∑_{j=1}^{n} W_j′ Y_j K(X_j), where A_j represents a row vector extracted from A. That is, it computes the kernel-weighted sum of the outer product of the rows of W and Y. The examples below illustrate uses of such sums.

npksum may be invoked either with a formula-like symbolic description of the variables on which the sum is to be performed or through a simpler interface whereby data is passed directly to the function via the txdat and tydat parameters.
Use of these two interfaces is mutually exclusive.

Data contained in the data frame txdat (and also exdat) may be a mix of continuous (default), unordered discrete (to be specified in the data frame txdat using the factor command), and ordered discrete (to be specified in the data frame txdat using the ordered command). Data can be entered in an arbitrary order and data types will be detected automatically by the routine (see np for details).

Data for which bandwidths are to be estimated may be specified symbolically. A typical description has the form dependent data ~ explanatory data, where dependent data and explanatory data are both series of variables specified by name, separated by the separation character '+'. For example, y1 ~ x1 + x2 specifies that y1 is to be kernel-weighted by x1 and x2 throughout the sum. See below for further examples.

A variety of kernels may be specified by the user. Kernels implemented for continuous data types include the second-, fourth-, sixth-, and eighth-order Gaussian and Epanechnikov kernels, and the uniform kernel. Unordered discrete data types use a variation on Aitchison and Aitken's (1976) kernel, while ordered data types use a variation of the Wang and van Ryzin (1981) kernel (see np for details).

Value

npksum returns an npkernelsum object with the following components:

eval    the evaluation points
ksum    the sum at the evaluation points
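The 'generalized product kernel' for mixed data types described above can be sketched numerically. The following is a minimal illustration in Python rather than R; the function names, bandwidths, and data are invented for the sketch and are not part of the np API. It combines a second-order Gaussian kernel for a continuous variable with an Aitchison-Aitken-style kernel for an unordered discrete variable, and forms the kernel-weighted sum of y over the sample:

```python
import numpy as np

def gaussian(u):
    # Second-order Gaussian kernel for continuous data
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def aitchison_aitken(xj, x, lam, c):
    # Aitchison-Aitken-style kernel for an unordered discrete variable
    # with c categories and smoothing parameter lam in [0, (c-1)/c]
    return np.where(xj == x, 1.0 - lam, lam / (c - 1))

def product_kernel_sum(xc, xd, y, x0c, x0d, h, lam, c):
    # Kernel-weighted sum of y: sum_j y_j * K((xc_j - x0c)/h)/h * L(xd_j, x0d)
    kc = gaussian((xc - x0c) / h) / h
    kd = aitchison_aitken(xd, x0d, lam, c)
    return np.sum(y * kc * kd)

rng = np.random.default_rng(42)
n = 1000
xc = rng.normal(size=n)              # continuous variable
xd = rng.integers(0, 3, size=n)      # unordered discrete variable, 3 categories
y = np.ones(n)                       # y = 1 gives an unnormalized density-style sum

s = product_kernel_sum(xc, xd, y, x0c=0.0, x0d=1, h=0.5, lam=0.1, c=3)
```

Dividing s by n would yield a mixed-data density estimate at the point (0.0, 1), which is the kind of estimator npksum is designed to let you build.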
Usage Issues

If you are using data of mixed types, then it is advisable to use the data.frame function to construct your input data and not cbind, since cbind will typically not work as intended on mixed data types and will coerce the data to the same type.

Author(s)

Tristen Hayfield <hayfield@phys.ethz.ch>, Jeffrey S. Racine <racinej@mcmaster.ca>

References

Aitchison, J. and C.G.G. Aitken (1976), "Multivariate binary discrimination by the kernel method," Biometrika, 63, 413-420.

Li, Q. and J.S. Racine (2007), Nonparametric Econometrics: Theory and Practice, Princeton University Press.

Li, Q. and J.S. Racine (2003), "Nonparametric estimation of distributions with categorical and continuous data," Journal of Multivariate Analysis, 86, 266-292.

Pagan, A. and A. Ullah (1999), Nonparametric Econometrics, Cambridge University Press.

Scott, D.W. (1992), Multivariate Density Estimation: Theory, Practice and Visualization, New York: Wiley.

Silverman, B.W. (1986), Density Estimation, London: Chapman and Hall.

Wang, M.C. and J. van Ryzin (1981), "A class of smooth estimators for discrete distributions," Biometrika, 68, 301-309.

Examples

# EXAMPLE 1: For this example, we generate 100,000 observations from a
# N(0, 1) distribution, then evaluate the kernel density on a grid of 50
# equally spaced points using the npksum() function, then compare
# results with the (identical) npudens() function output
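The leave-one-out cross-validation use of kernel sums mentioned in the description can be sketched as follows. This is a Python illustration of the objective being minimized (least-squares cross-validation for the local constant, i.e. Nadaraya-Watson, estimator), not the np package's own code; the data, bandwidth bounds, and optimizer are all assumptions made for the sketch:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def loo_cv(h, x, y):
    # Leave-one-out least-squares CV objective for the local constant
    # (Nadaraya-Watson) estimator with a Gaussian kernel and bandwidth h
    u = (x[:, None] - x[None, :]) / h          # pairwise differences
    k = np.exp(-0.5 * u**2)                    # Gaussian kernel weights
    np.fill_diagonal(k, 0.0)                   # leave observation i out
    ghat = k @ y / k.sum(axis=1)               # leave-one-out fitted values
    return np.mean((y - ghat) ** 2)

rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, size=200)
y = np.sin(x) + rng.normal(scale=0.3, size=200)

# Minimize the CV objective over the bandwidth, analogous to wrapping an
# npksum()-based objective in a call to nlm() in R
res = minimize_scalar(lambda h: loo_cv(h, x, y), bounds=(0.05, 2.0),
                      method="bounded")
```

In the np package itself the inner kernel sums would be computed by npksum(), so that only the scalar objective needs to be written in R.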