R> plot(glaucoma_ctree)

[Figure 9.7 here: a conditional inference tree with inner nodes vari (p < 0.001, cutpoint 0.059), vasg (p < 0.001, cutpoint 0.066) and tms (p = 0.049, cutpoint −0.066), and bar plots of the class distribution (normal vs. glaucoma) in the terminal nodes 3 (n = 79), 4 (n = 8), 6 (n = 65) and 7 (n = 44).]

Figure 9.7  Conditional inference tree with the distribution of glaucomatous eyes shown for each terminal leaf.

A graphical representation is depicted in Figure 9.7, showing both the cutpoints and the p-values of the associated independence tests for each node. The first split is performed using a cutpoint defined with respect to the volume of the optic nerve above some reference plane, but in the inferior part of the eye only (vari).

9.4 Summary

Recursive partitioning procedures are rather simple non-parametric tools for regression modelling. The main structure of the regression relationship can be visualised in a straightforward way. However, one should bear in mind that these models are very simple and can serve only as a rough approximation to reality. When multiple simple models are averaged, however, powerful predictors can be constructed.

Exercises

Ex. 9.1 Construct a regression tree for the Boston Housing data reported by Harrison and Rubinfeld (1978), which are available as the data.frame BostonHousing from package mlbench (Leisch and Dimitriadou, 2009). Compare the predictions of the tree with the predictions obtained from randomForest. Which method is more accurate?
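As a starting point for Ex. 9.1, the following sketch fits a single regression tree with rpart and a random forest to the BostonHousing data and compares their test-set mean squared errors. The training/test split, the seed, and the choice of rpart for the single tree are assumptions made here for illustration; they are not prescribed by the exercise.

R> library("rpart")
R> library("randomForest")
R> data("BostonHousing", package = "mlbench")
R> set.seed(290875)
R> # hold out one third of the observations as a test set (an assumption)
R> train <- sample(nrow(BostonHousing), floor(2/3 * nrow(BostonHousing)))
R> test <- BostonHousing[-train, ]
R> bh_rpart <- rpart(medv ~ ., data = BostonHousing[train, ])
R> bh_rf <- randomForest(medv ~ ., data = BostonHousing[train, ])
R> # test-set mean squared error of the single tree
R> mean((test$medv - predict(bh_rpart, newdata = test))^2)
R> # test-set mean squared error of the random forest
R> mean((test$medv - predict(bh_rf, newdata = test))^2)

The random forest, being an average over many trees grown on bootstrap samples, will typically show the smaller test error, illustrating the remark in the summary that averaging multiple simple models can yield powerful predictors.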

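For readers who want to reproduce Figure 9.7: the fitting of glaucoma_ctree is not shown on this page. A plausible reconstruction is sketched below; it assumes that the GlaucomaM data are available (currently shipped with package TH.data, earlier releases provided them in package ipred) and that the tree is a conditional inference tree fitted with ctree from package party using its default settings.

R> library("party")
R> # assumption: location of the GlaucomaM data
R> data("GlaucomaM", package = "TH.data")
R> # conditional inference tree for the binary class variable
R> glaucoma_ctree <- ctree(Class ~ ., data = GlaucomaM)
R> plot(glaucoma_ctree)

The inner-node variables and p-values printed in the figure should then match the output of print(glaucoma_ctree), up to differences between package versions.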