Modeling and Multivariate Methods

Chapter 3 Fitting Standard Least Squares Models 107

Effect Options

Table 3.11 Description of Effect Options (Continued)

LSMeans Tukey HSD
Shows a test that is sized for all differences among the least squares means. This is the Tukey or Tukey-Kramer HSD (Honestly Significant Difference) test (Tukey 1953, Kramer 1956). The test is an exact alpha-level test if the sample sizes are the same and conservative if the sample sizes are different (Hayter 1984). See the Basic Analysis and Graphing book. (A Python sketch of a Tukey-Kramer comparison follows this table.)

LSMeans Dunnett
Performs multiple comparisons of each level against a control level. This test is called Dunnett's test. In the Fit Model platform, Hsu's factor analytical approximation is used to calculate p-values and confidence intervals. The test results are also plotted. (A Dunnett sketch also follows this table.)

Test Slices
Note: This option is available only for interaction effects.
For each level of each classification column in the interaction, this option makes comparisons among all the levels of the other classification columns in the interaction. For example, if an interaction is A*B*C, there is a slice called A=1, which tests all the B*C levels when A=1; there is another slice called A=2, and so on for all the levels of B and C. This is a way to detect the importance of levels inside an interaction.

Power Analysis
Shows the Power Details report, which enables you to analyze the power for the effect test. For details, see “Parameter Power” on page 75.
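For readers working outside JMP, the kind of comparison behind LSMeans Tukey HSD can be sketched with SciPy. This is not JMP's implementation: SciPy's tukey_hsd compares raw group means rather than least squares means, so it matches the option above only for a simple one-way layout, and the sample values below are invented for illustration.

```python
from scipy.stats import tukey_hsd

# Hypothetical measurements for three factor levels (not JMP sample data).
group_a = [12.1, 13.4, 11.8, 12.9]
group_b = [14.2, 15.1, 14.8, 15.5, 14.0]
group_c = [11.0, 10.7, 11.9]

# Unequal group sizes give the conservative Tukey-Kramer form of the test.
result = tukey_hsd(group_a, group_b, group_c)
print(result)                        # pairwise differences with adjusted p-values
print(result.confidence_interval())  # simultaneous 95% confidence intervals
```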
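A comparison of each level against a control level, in the spirit of LSMeans Dunnett, can likewise be sketched with SciPy's dunnett function (SciPy 1.11 or later). SciPy obtains the adjusted p-values from the multivariate t distribution rather than from Hsu's factor analytical approximation, and again the data below are invented.

```python
from scipy.stats import dunnett

# Hypothetical control and treatment samples (requires SciPy >= 1.11).
control = [10.2, 9.8, 10.5, 10.1]
treat_1 = [11.0, 11.4, 10.9, 11.2]
treat_2 = [9.9, 10.0, 10.3, 9.7]

# Compare each treatment level against the control level.
res = dunnett(treat_1, treat_2, control=control)
print(res.pvalue)                 # one adjusted p-value per treatment-vs-control comparison
print(res.confidence_interval())  # simultaneous confidence intervals for the differences
```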

LSMeans Table

Least squares means are predicted values from the specified model across the levels of a categorical effect, where the other model factors are controlled by being set to neutral values. The neutral values are the sample means (possibly weighted) for regressors with interval values, and the average coefficient over the levels for unrelated nominal effects.

Least squares means are the values that let you see which levels produce higher or lower responses, holding the other variables in the model constant. Least squares means are also called adjusted means or population marginal means. Least squares means can differ from simple means when there are other effects in the model.

Least squares means are the statistics that are compared when effects are tested. They might not reflect typical real-world values of the response if the values of the factors do not reflect prevalent combinations of values in the real world. Least squares means are useful as comparisons in experimental situations.
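A small sketch may help make the "neutral values" idea concrete. The data and column names below are invented; the point is only that predicting each level of a categorical effect with the covariate held at its overall sample mean gives adjusted (least squares) means that can differ from the simple group means.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Invented data: two groups whose x values are very different.
df = pd.DataFrame({
    "group": ["A", "A", "A", "B", "B", "B", "B"],
    "x":     [1.0, 2.0, 3.0, 6.0, 7.0, 8.0, 9.0],
    "y":     [2.1, 3.0, 3.8, 7.2, 8.1, 8.9, 10.0],
})

# Fit a model with a categorical effect (group) and a continuous covariate (x).
model = smf.ols("y ~ group + x", data=df).fit()

# Simple means are pulled apart by the unequal x values in the two groups.
print(df.groupby("group")["y"].mean())

# Adjusted (least squares) means: predict each group with x held at a
# neutral value, here the overall sample mean of x.
neutral = pd.DataFrame({"group": ["A", "B"], "x": df["x"].mean()})
print(model.predict(neutral))
```

Because the two hypothetical groups have very different x values, their adjusted means end up much closer together than their raw means, which is the kind of discrepancy described above.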

Example of a Least Squares Means Table

1. Open the Big Class.jmp sample data table.
2. Select Analyze > Fit Model.
3. Select height and click Y.
