= Matrix algebra derivation of Sums of Squares =

$$Z^{T} = \begin{pmatrix} 1 & \cdots & 1 \\ z_{1} & \cdots & z_{N} \end{pmatrix}$$

$$Z^{T}Z = \begin{pmatrix} N & \sum_{i}z_{i} \\ \sum_{i}z_{i} & \sum_{i}z_{i}^{2} \end{pmatrix}$$

Because the covariate is standardised, $$\sum_{i}z_{i} = 0$$, so the off-diagonal entries vanish and

$$(Z^{T}Z)^{-1} = \begin{pmatrix} \frac{\sum_{i}z_{i}^{2}}{N\sum_{i}z_{i}^{2}} & 0 \\ 0 & \frac{N}{N\sum_{i}z_{i}^{2}} \end{pmatrix}$$

which may be expressed more simply as

$$(Z^{T}Z)^{-1} = \begin{pmatrix} \frac{1}{N} & 0 \\ 0 & \frac{1}{\sum_{i}z_{i}^{2}} \end{pmatrix}$$

$$Z^{T}Y = \begin{pmatrix} \sum_{i}y_{i} \\ \sum_{i}z_{i}y_{i} \end{pmatrix}$$

The regression terms are then obtained using the least squares estimate $$B = (Z^{T}Z)^{-1}Z^{T}Y$$. Two terms are required: in the regression of the standardised covariate on the difference in a pair of response-level means we require the regression estimates of the intercept (W1) and of the covariate (W1 x covariate).

For the intercept, the above gives B = the average difference between the levels of W1; call this $$\bar{y}$$.

For the W1 x covariate interaction, $$B = \frac{\sum_{i}z_{i}y_{i}}{\sum_{i}z_{i}^{2}}$$.

Taking the SS explained by the regression to be $$B^{T}Z^{T}ZB$$ (see e.g., Rao, Toutenburg, Shalabh and Heumann (2007)) and using the appropriate Bs above together with the diagonal entries of $$Z^{T}Z$$, this gives a W1 SS of $$\bar{y} \times \bar{y} \times N$$ and $$\left(\frac{\sum_{i}z_{i}y_{i}}{\sum_{i}z_{i}^{2}}\right)^{2} \times \sum_{i}z_{i}^{2} = \frac{(\sum_{i}z_{i}y_{i})^{2}}{\sum_{i}z_{i}^{2}}$$ for the W1 x covariate SS.

__Reference__

Rao CR, Toutenburg H, Shalabh, and Heumann C (2007). Linear models and generalizations: least squares and alternatives. Third Edition. Springer-Verlag: Berlin.
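The decomposition can be checked numerically. The sketch below (plain Python, with invented example data; the variable names are mine, not from the derivation) centres the covariate so that $$\sum_{i}z_{i} = 0$$, computes the two closed-form SS terms, and confirms they sum to $$B^{T}Z^{T}ZB$$:

```python
# Numerical check of the SS decomposition derived above.
# Example data (invented): N = 4 observations.
z_raw = [1.0, 2.0, 3.0, 4.0]   # raw covariate values
y = [2.0, 3.0, 5.0, 4.0]       # response values
N = len(y)

# Centre the covariate so that sum(z) = 0; this is what makes
# Z^T Z diagonal (the scaling part of standardisation does not
# affect the sums of squares).
zbar = sum(z_raw) / N
z = [zi - zbar for zi in z_raw]

# Closed-form terms from the derivation.
ybar = sum(y) / N                            # intercept estimate B_0
szz = sum(zi * zi for zi in z)               # sum_i z_i^2
szy = sum(zi * yi for zi, yi in zip(z, y))   # sum_i z_i y_i
ss_w1 = N * ybar * ybar                      # W1 SS = N * ybar^2
ss_w1_cov = szy * szy / szz                  # W1 x covariate SS

# Direct B^T (Z^T Z) B with Z^T Z = diag(N, szz), since sum(z) = 0.
b0, b1 = ybar, szy / szz
ss_model = b0 * b0 * N + b1 * b1 * szz

print(ss_w1, ss_w1_cov, ss_model)
```

With these data the two SS terms are 49.0 and 3.2, and their sum equals the model SS exactly, illustrating that the diagonal form of $$Z^{T}Z$$ splits the explained SS cleanly into the two components.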