How do I compute Akaike's information criterion (AIC) to compare regression models?
Akaike's information criterion (AIC) compares multivariate models fitted to the same data by combining the degree of fit with the number of terms in the model. Simpler, better-fitting models have smaller AICs, so the model with the smallest AIC is preferred. AIC can be used as an alternative to the F ratio in stepwise regression when investigating the effect of adding or removing one or more predictors from a model (see an example in the Regression Grad talk).
AIC = n ln(RSS/n) + 2 df(model)
where RSS is the Residual Sum of Squares, which is routinely output by the regression analysis, n is the total sample size, and df(model) is the degrees of freedom of the regression model, i.e. the number of parameters, equal to the number of predictors + 1 (for the intercept). The above formula for AIC is given on page 63 of Burnham and Anderson (2002).
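As a minimal sketch, the formula above can be computed directly from the RSS, sample size and number of predictors; the two models and their RSS values below are hypothetical, for illustration only:

```python
import math

def aic(rss, n, num_predictors):
    """AIC = n ln(RSS/n) + 2 df(model), with df(model) = predictors + 1
    (for the intercept), as on page 63 of Burnham and Anderson (2002)."""
    df_model = num_predictors + 1
    return n * math.log(rss / n) + 2 * df_model

# Two hypothetical models fitted to the same data (n = 50):
aic_full = aic(rss=120.0, n=50, num_predictors=3)
aic_reduced = aic(rss=125.0, n=50, num_predictors=2)
# The model with the smaller AIC is preferred.
```

Note that dropping a predictor lowers the penalty term (2 df) but typically raises the RSS, so AIC rewards the extra predictor only if it improves the fit enough to offset the penalty.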
Reference
Burnham, K.P. and Anderson, D.R. (2002). Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach. 2nd ed. New York: Springer.
(A pdf copy of the above book may also be downloaded for free [[http://www.mun.ca/biology/quant/ModelSelectionMultimodelInference.pdf| from here.]])