Applied Statistics Using SPSS, STATISTICA, MATLAB and R

SST = \sum_{i=1}^{c}\sum_{j=1}^{r} (x_{ij} - \bar{x}_{..})^2
    = r\sum_{i=1}^{c} (\bar{x}_{i.} - \bar{x}_{..})^2 + c\sum_{j=1}^{r} (\bar{x}_{.j} - \bar{x}_{..})^2 + \sum_{i=1}^{c}\sum_{j=1}^{r} (x_{ij} - \bar{x}_{i.} - \bar{x}_{.j} + \bar{x}_{..})^2
    = SSC + SSR + SSE .                                                                4.40

Besides the term SST described in the previous section, the sums of squares have the following interpretation:

1. SSC represents the sum of squares or dispersion along the columns, as the previous SSB. The variance along the columns, vc = SSC/(c−1), has c−1 degrees of freedom and is the point estimate of \sigma^2 + r\sigma_c^2.

2. SSR represents the dispersion along the rows, i.e., it is the row version of the previous SSB. The variance along the rows, vr = SSR/(r−1), has r−1 degrees of freedom and is the point estimate of \sigma^2 + c\sigma_r^2.

3. SSE represents the residual dispersion or experimental error. The experimental variance associated with the randomness of the experiment, ve = SSE/[(c−1)(r−1)], has (c−1)(r−1) degrees of freedom and is the point estimate of \sigma^2.

Note that formula 4.40 can only be obtained when c and r are constant along the rows and along the columns, respectively. This corresponds to the so-called orthogonal experiment.
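As an illustration of formula 4.40, the following R sketch computes the four sums of squares for a small made-up data matrix with one observation per cell (columns indexed by i = 1, ..., c and rows by j = 1, ..., r; all values are hypothetical) and verifies the orthogonal decomposition:

x <- matrix(c(23, 25, 28, 31,
              20, 24, 27, 29,
              22, 26, 25, 30), nrow = 3, byrow = TRUE)   # hypothetical cell values (r = 3, c = 4)
r  <- nrow(x); cc <- ncol(x)         # cc rather than c, to avoid masking base::c
x.. <- mean(x)                       # global mean
xi. <- colMeans(x)                   # column means, one per i
x.j <- rowMeans(x)                   # row means, one per j
SST <- sum((x - x..)^2)                              # total dispersion
SSC <- r  * sum((xi. - x..)^2)                       # dispersion along the columns
SSR <- cc * sum((x.j - x..)^2)                       # dispersion along the rows
SSE <- sum((x - outer(x.j, xi., "+") + x..)^2)       # residual dispersion
all.equal(SST, SSC + SSR + SSE)                      # TRUE: decomposition 4.40 holds
vc <- SSC / (cc - 1)                 # estimates sigma^2 + r*sigma_c^2
vr <- SSR / (r - 1)                  # estimates sigma^2 + c*sigma_r^2
ve <- SSE / ((cc - 1) * (r - 1))     # estimates sigma^2

The variances vc, vr and ve obtained in this way are the ones used in the F tests described below.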

In the situation shown in Table 4.19, it is possible to consider every cell value as a random case from a population with mean µij, such that:

\mu_{ij} = \mu + \mu_{i.} + \mu_{.j} , with \sum_{i=1}^{c} \mu_{i.} = 0 and \sum_{j=1}^{r} \mu_{.j} = 0 ,        4.41

i.e., the mean of the population corresponding to cell ij is obtained by adding to a global mean µ the means along the columns and along the rows. The sum of the means along the columns, as well as the sum of the means along the rows, is zero. Therefore, when computing the mean of all cells we obtain the global mean µ. It is assumed that the variance for all cell populations is \sigma^2.

In this single-observation, additive-effects model, one can therefore treat the effects along the columns and along the rows independently, testing the following null hypotheses:

H01: There are no column effects, µi. = 0 for every i.
H02: There are no row effects, µ.j = 0 for every j.
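As a minimal sketch of how these hypotheses could be tested in R, the additive model may be fitted with the built-in aov function; the data are the same hypothetical values as above, reshaped to long format, and the factor names col.factor and row.factor are purely illustrative:

d <- expand.grid(col.factor = factor(1:4), row.factor = factor(1:3))  # c = 4 columns, r = 3 rows
d$x <- c(23, 25, 28, 31,
         20, 24, 27, 29,
         22, 26, 25, 30)             # same hypothetical values, column factor varying fastest
fit <- aov(x ~ col.factor + row.factor, data = d)   # no interaction term: single observation per cell
summary(fit)                         # F tests for column effects (H01) and row effects (H02)

The F value reported for col.factor corresponds to the ratio vc/ve discussed next, and that for row.factor to vr/ve.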


The null hypothesis H01 is tested using the ratio vc/ve, which, under the assumptions of independent sampling on normal distributions and with equal r
