
Matrix algebra derivation of Sums of Squares

$$Z^{T} = \begin{pmatrix} 1 & \ldots & 1 \\ z_{1} & \ldots & z_{N} \end{pmatrix}$$

$$Z^{T}Z = \begin{pmatrix} N & \sum_{i}z_{i} \\ \sum_{i}z_{i} & \sum_{i}z_{i}^{2} \end{pmatrix}$$

Since the covariate is standardised it has mean zero, so $$\sum_{i}z_{i} = 0$$ and $$Z^{T}Z$$ is diagonal. Hence

$$(Z^{T}Z)^{-1} = \begin{pmatrix} \frac{\sum_{i}z_{i}^{2}}{N\sum_{i}z_{i}^{2}} & 0 \\ 0 & \frac{N}{N\sum_{i}z_{i}^{2}} \end{pmatrix}$$

which may be more simply expressed as

$$(Z^{T}Z)^{-1} = \begin{pmatrix} \frac{1}{N} & 0 \\ 0 & \frac{1}{\sum_{i}z_{i}^{2}} \end{pmatrix}$$
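
As an illustrative check (not part of the original derivation), the diagonal form of $$(Z^{T}Z)^{-1}$$ can be verified numerically with a short Python/numpy sketch. The covariate values in z below are hypothetical and are mean-centred so that $$\sum_{i}z_{i} = 0$$, as assumed above.

import numpy as np

# Hypothetical covariate values, mean-centred so that sum(z) = 0
z = np.array([1.2, -0.7, 0.3, 0.8])
z = z - z.mean()
N = len(z)

# Design matrix Z: a column of ones (intercept) and the centred covariate
Z = np.column_stack([np.ones(N), z])

ZtZ = Z.T @ Z                 # approximately [[N, 0], [0, sum(z**2)]]
ZtZ_inv = np.linalg.inv(ZtZ)  # approximately [[1/N, 0], [0, 1/sum(z**2)]]

print(np.allclose(ZtZ_inv, np.diag([1 / N, 1 / np.sum(z ** 2)])))  # True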

$$Z^{T}Y = \begin{pmatrix} \sum_{i}y_{i} \\ \sum_{i}z_{i}y_{i} \end{pmatrix}$$

Then the regression terms are obtained using the least squares estimate B = $$(Z^{T}Z)^{-1}Z^{T}Y$$. Two terms are required: in the regression of the difference in a pair of response level means on the standardised covariate we require the regression estimates of the intercept (W1) and of the covariate (W1 x covariate).

For the intercept, using the above, B = $$\frac{\sum_{i}y_{i}}{N}$$, the average difference between the levels of W1; call this $$\bar{y}$$.

For the W1 x covariate interaction, B = $$\frac{\sum_{i}z_{i}y_{i}}{\sum_{i}z_{i}^{2}}$$.
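
Continuing the hypothetical numpy sketch above with an illustrative response vector y (e.g. a set of difference scores between two W1 levels), both regression estimates can be checked against these closed forms:

# Hypothetical response values (differences between two W1 levels)
y = np.array([2.0, 1.1, 1.6, 2.3])

ZtY = Z.T @ y        # [sum(y), sum(z*y)]
B = ZtZ_inv @ ZtY    # least squares estimate (Z'Z)^{-1} Z'Y

print(np.allclose(B[0], y.mean()))                        # intercept = ybar
print(np.allclose(B[1], np.sum(z * y) / np.sum(z ** 2)))  # slope = sum(zy)/sum(z^2)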

Taking the SS explained by the regression equal to $$B^{T}(Z^{T}Z)B$$ (see e.g. Rao, Toutenburg, Shalabh and Heumann (2007)) and using the appropriate Bs above together with the diagonal entries of $$Z^{T}Z$$, this gives a W1 SS of $$\bar{y} \times \bar{y} \times N = N\bar{y}^{2}$$ and a W1 x covariate SS of $$\left(\frac{\sum_{i}z_{i}y_{i}}{\sum_{i}z_{i}^{2}}\right)^{2} \times \sum_{i}z_{i}^{2} = \frac{(\sum_{i}z_{i}y_{i})^{2}}{\sum_{i}z_{i}^{2}}$$.
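
As a final hedged check, still using the hypothetical z and y above, the regression SS $$B^{T}(Z^{T}Z)B$$ splits into exactly these two terms because $$Z^{T}Z$$ is diagonal:

# SS explained by the regression, B' (Z'Z) B
SS_model = B @ ZtZ @ B

SS_W1 = N * y.mean() ** 2                           # N * ybar^2
SS_W1_by_cov = np.sum(z * y) ** 2 / np.sum(z ** 2)  # (sum zy)^2 / sum z^2

print(np.allclose(SS_model, SS_W1 + SS_W1_by_cov))  # True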

Reference

Rao CR, Toutenburg H, Shalabh and Heumann C (2007). Linear models and generalizations: least squares and alternatives. Third Edition. Springer-Verlag: Berlin.
