There are various model selection criteria in use for choosing variables in linear regression. Some also apply to models beyond linear regression.
- Akaike’s Information Criterion – A criterion for comparing models by how much information they lose in describing the data, useful for deciding whether to omit certain variables. AIC draws its justification from information theory.
- Coefficient of Determination – The $R^2$ value gives the proportion of variation in the dependent variable explained by variation in the independent variables.
- Adjusted Coefficient of Determination – Penalizes the coefficient of determination for the number of predictors. Offers an alternative to AIC.
- Variance Inflation Factor – Quantifies the degree of multicollinearity in the model.
Akaike Information Criterion
Akaike Information Criterion, or AIC, is a measure of the strength of a given model at describing the data, relative to competing models. As such, if all models fit poorly, AIC won’t give any indication of that. AIC was developed by Hirotugu Akaike in 1974. It draws its justification from information theory.
$$\mathrm{AIC} = 2k - 2\ln(\hat{L})$$

where $k$ is the number of coefficients being estimated in the model, and $\hat{L}$ is the maximized value of the likelihood function of the model.
AIC effectively penalizes the model for having too many predictor variables. The “best fit” is the one that minimizes the AIC, because it offers the best tradeoff between maximizing the log-likelihood and minimizing the number of predictors in the model. Additional predictors are only included if they offer enough information to justify their inclusion.
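As a concrete sketch (Python with statsmodels; the simulated data and variable names here are our own, purely for illustration), we can fit two competing models, compare their AIC values, and check the formula by hand from the log-likelihood:

```python
import numpy as np
import statsmodels.api as sm

# Toy data: y depends on x1, while x2 is pure noise.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 * x1 + rng.normal(size=n)

# Two competing models: with and without the noise predictor.
X_small = sm.add_constant(x1)
X_large = sm.add_constant(np.column_stack([x1, x2]))
fit_small = sm.OLS(y, X_small).fit()
fit_large = sm.OLS(y, X_large).fit()

# AIC = 2k - 2 ln(L_hat); statsmodels exposes both the AIC and the
# maximized log-likelihood, so we can verify the formula directly.
print("small model AIC:", fit_small.aic)   # typically lower
print("large model AIC:", fit_large.aic)   # charged for the extra predictor
print("by hand:        ", 2 * X_small.shape[1] - 2 * fit_small.llf)
```

The smaller model should usually win here: x2 carries no information about y, so the gain in log-likelihood from including it does not cover the penalty for estimating its coefficient.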
For more information, visit:
- Wikipedia: Akaike Information Criterion
- NCSU CRSC: Akaike Information Criterion (pdf)
Coefficient of Determination
$R^2$ is the coefficient of determination. It is the proportion of variation in the dependent variable explained by variation in the independent variables. We must first define some notation:
- $\bar{y}$ is the mean of the observed values.
- $\hat{y}_i$ is the predicted value for the dependent variable at the $i$th observation.
- Sum of Squares – a sum of squared deviations. Three appear below.
- $SS_{tot} = \sum_i (y_i - \bar{y})^2$ is the total deviation from the mean, or total variation in the data.
- $SS_{reg} = \sum_i (\hat{y}_i - \bar{y})^2$ is the total variation explained by the regression model. This is the value we are trying to maximize, with a theoretical ceiling at $SS_{tot}$.
- $SS_{res} = \sum_i (y_i - \hat{y}_i)^2$ is the total unexplained variation in the model; that is, the deviation from the predicted values. This is the value we are trying to minimize.
$$R^2 = \frac{SS_{reg}}{SS_{tot}} = 1 - \frac{SS_{res}}{SS_{tot}}$$

$R^2$ is an absolute measure of the “Goodness of Fit” of a model. In order to get a measure of fit that penalizes high numbers of predictors, we can use the adjusted $R^2$, denoted $\bar{R}^2$:

$$\bar{R}^2 = 1 - \frac{SS_{res}/df_e}{SS_{tot}/df_t}$$

where $df_e = n - p - 1$ is the degrees of freedom of the error term (for $n$ observations and $p$ predictors), and $df_t = n - 1$ is the total degrees of freedom. (We subtract 1 degree of freedom for calculation of the mean, $\bar{y}$.)
$\bar{R}^2$ is a useful criterion for comparing models and determining how well a given model represents the data. It penalizes adding more predictors to the model in a similar vein to AIC, but does so in a different manner. The models selected as “best” will not necessarily be the same, and AIC is generally preferred. In some disciplines (e.g., econometrics) both of these criteria are of limited use.
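To make the formulas concrete, here is a minimal Python sketch (the helper name `r_squared_stats` and the toy data are our own, not from any particular library) that computes $R^2$ and $\bar{R}^2$ directly from the sums of squares defined above:

```python
import numpy as np

def r_squared_stats(y, y_hat, p):
    """R^2 and adjusted R^2 from observations y, fitted values y_hat,
    and the number of predictors p."""
    y = np.asarray(y)
    y_hat = np.asarray(y_hat)
    n = len(y)
    ss_tot = np.sum((y - y.mean()) ** 2)   # total variation in the data
    ss_res = np.sum((y - y_hat) ** 2)      # unexplained variation
    r2 = 1.0 - ss_res / ss_tot
    df_e = n - p - 1                       # error degrees of freedom
    df_t = n - 1                           # total degrees of freedom
    r2_adj = 1.0 - (ss_res / df_e) / (ss_tot / df_t)
    return r2, r2_adj

# Example: score a least-squares line fit to noisy data.
rng = np.random.default_rng(1)
x = rng.normal(size=50)
y = 3.0 * x + rng.normal(size=50)
slope, intercept = np.polyfit(x, y, 1)
print(r_squared_stats(y, slope * x + intercept, p=1))
```

Note that $\bar{R}^2$ is always at most $R^2$, and it can decrease when a new predictor fails to explain enough additional variation to justify the lost degree of freedom.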
For more information, visit:
- Wikipedia: Coefficient of Determination
- Iowa State: Regression Inference (pdf)
Variance Inflation Factor
The Variance Inflation Factor, or VIF, gives a numerical value to the severity of multicollinearity in a dataset. Multicollinearity is the degree to which some of the predictors in your dataset can be approximated as linear combinations of other predictors in your dataset. Ideally, the predictors would all be independent; the VIF gives us a quantitative way of describing how far that is from the case for a given variable.
For a given predictor variable in a predictive model, the VIF for that variable is a function of the $R^2$ value of a regression model predicting that variable from all other predictor variables in the predictive model. For variable $j$:

$$\mathrm{VIF}_j = \frac{1}{1 - R_j^2}$$

where $R_j^2$ is the $R^2$ of a model regressing $X_j$ on the other predictors. Since $R_j^2$ is a coefficient of determination, it is the proportion of variation in $X_j$ explained by the other predictor variables; the higher $R_j^2$, the better $X_j$ can be approximated by the other variables in the model. The more predictable $X_j$ is from the other variables, the more the variance of its estimated coefficient is inflated, and the less reliable any inferences made with that variable become. As a general rule, we try to omit variables with a VIF higher than 5. That may be difficult to do in practice, so we treat this as a guideline rather than a hard rule.
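The sketch below (again Python with statsmodels, and made-up data constructed to be collinear) uses statsmodels’ `variance_inflation_factor`, which performs exactly this regress-each-predictor-on-the-others computation:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Toy design matrix: x3 is nearly a linear combination of x1 and x2,
# so all three predictors should show inflated VIFs.
rng = np.random.default_rng(2)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = x1 + x2 + 0.1 * rng.normal(size=n)

X = sm.add_constant(np.column_stack([x1, x2, x3]))

# Column 0 is the intercept; report the VIF for each actual predictor.
for j, name in enumerate(["x1", "x2", "x3"], start=1):
    print(name, variance_inflation_factor(X, j))
```

Dropping x3 and rerunning should bring the remaining VIFs back down near 1, since x1 and x2 were generated independently.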
None of these regression criteria are absolute, and many times they do not agree with each other on what the best model is. You, as a statistician, must use your best judgement in selecting a model.