Stepwise Regression Definition
Stepwise regression is a statistical method of building a regression model in which the selection of independent variables is carried out automatically. The model is built in repeated steps: in each step a variable is either added (forward selection) or removed (backward elimination), with the decision made according to a specified criterion or series of statistical tests.
A Little More on What is a Stepwise Regression
Forward and backward selection of independent variables is the core of stepwise regression. In the forward approach, independent variables are tested one at a time, and a variable is added to the model only if it is statistically significant. The backward approach works in reverse: all candidate variables are entered into the model first, and those that are not statistically significant are removed step by step.
Several tests can be used to determine whether the independent variables in a regression model are statistically significant, including F-tests, t-tests, and comparisons of adjusted R-squared.
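As a rough illustration of the forward approach, the sketch below adds predictors to an ordinary least squares model one at a time based on t-test p-values. The synthetic data, column names, and the 0.05 significance threshold are assumptions made for this example only; they are not part of the definition above, and real analyses may use different criteria.

```python
# A minimal sketch of forward stepwise selection driven by p-values.
# The data frame, column names, and 0.05 cutoff are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(X, y, threshold=0.05):
    """Add one predictor at a time, keeping the candidate with the
    smallest p-value, until no remaining candidate is significant."""
    selected = []
    remaining = list(X.columns)
    while remaining:
        pvals = {}
        for candidate in remaining:
            cols = selected + [candidate]
            model = sm.OLS(y, sm.add_constant(X[cols])).fit()
            pvals[candidate] = model.pvalues[candidate]
        best = min(pvals, key=pvals.get)
        if pvals[best] < threshold:   # candidate passes the significance test
            selected.append(best)
            remaining.remove(best)
        else:                         # no remaining variable is significant; stop
            break
    return selected

# Example usage on synthetic data where only x1 and x3 matter
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(200, 4)), columns=["x1", "x2", "x3", "x4"])
y = 2.0 * X["x1"] - 1.5 * X["x3"] + rng.normal(size=200)
print(forward_stepwise(X, y))  # typically ['x1', 'x3']
```

A backward-elimination version would start from the full model and repeatedly drop the variable with the largest p-value above the threshold instead.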
Drawbacks of Stepwise Regression
Because stepwise regression selects statistically significant independent variables automatically, it can save analysts a great deal of time. However, statisticians have pointed out several drawbacks of the method. The major ones include:
- Stepwise regression is inherently biased: repeated testing on the same data inflates the apparent significance of the selected variables.
- The automated selection can settle on the wrong set of variables and produce misleading results.
- It can yield unnecessarily complex regression models that fit the sample data but generalize poorly.