Linear Regression - Explained
What is a Linear Regression?
In statistics, linear regression is a linear approach to modeling the relationship between a scalar response and one or more explanatory variables. When a single explanatory variable is involved, the approach is called simple linear regression; when multiple variables are included, it is called multiple linear regression. This is distinct from multivariate linear regression, which predicts multiple correlated dependent variables rather than a single scalar one.
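As a rough illustration of the distinction, the sketch below fits a simple regression (one explanatory variable) and a multiple regression (two explanatory variables) to synthetic data using ordinary least squares. The variable names, coefficient values, and data are illustrative assumptions, not part of the original text.

```python
import numpy as np

# Synthetic data: y depends linearly on two explanatory variables plus noise.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 3.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(scale=0.5, size=n)

# Simple linear regression: a single explanatory variable (x1 only).
X_simple = np.column_stack([np.ones(n), x1])            # intercept + x1
beta_simple, *_ = np.linalg.lstsq(X_simple, y, rcond=None)
print("simple  :", beta_simple)   # [intercept, slope for x1]

# Multiple linear regression: both explanatory variables.
X_multi = np.column_stack([np.ones(n), x1, x2])         # intercept + x1 + x2
beta_multi, *_ = np.linalg.lstsq(X_multi, y, rcond=None)
print("multiple:", beta_multi)    # [intercept, slope for x1, slope for x2]
```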
How is a Linear Regression Used?
Linear regression models these relationships using linear predictor functions whose unknown parameters are estimated from the data; such models are called linear models. Usually, the conditional mean of the response given the values of the explanatory variables is assumed to be an affine function of those values; less commonly, the conditional median or some other quantile is used. Linear regression focuses on the conditional probability distribution of the response given the values of the predictors, rather than on the joint probability distribution of all the variables, which is the domain of multivariate analysis.

Linear regression has been studied rigorously and applied widely in practice. This is because models that depend linearly on their unknown parameters are easier to fit than models that depend non-linearly on their parameters, and because the statistical properties of the resulting estimators are easier to determine.

Most applications of linear regression fall into one of two groups (see the sketch after this list):

- If the goal is prediction, forecasting, or error reduction, linear regression can be used to fit a predictive model to an observed data set of values of the response and explanatory variables. Once the model is fitted, it can be used to predict the response whenever additional values of the explanatory variables are collected without an accompanying response value.
- If the goal is to explain variation in the response variable that can be attributed to variation in the explanatory variables, linear regression can be used to quantify the strength of that relationship, to assess whether certain explanatory variables have no linear relationship with the response at all, or to identify which subsets of explanatory variables contain redundant information about the response.

Linear regression models are usually fitted using the least squares approach, but they may also be fitted in other ways, such as by minimizing a lack of fit in some other norm, or by minimizing a penalized version of the least squares cost function, as in ridge regression and the lasso. Conversely, the least squares approach can also be used to fit models that are not linear models.
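A minimal sketch of these two uses, assuming scikit-learn is available: an ordinary least squares model is fitted and then used to predict the response for new explanatory values, and ridge and lasso fits illustrate the penalized variants of the least squares cost function mentioned above. The data set, variable layout, and penalty strengths are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso

# Synthetic data set: the response depends on the first two of five
# explanatory variables; the remaining three are irrelevant noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = 4.0 + 1.5 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.3, size=200)

# Ordinary least squares fit, used for prediction on new explanatory values
# collected without an associated response.
ols = LinearRegression().fit(X, y)
X_new = rng.normal(size=(3, 5))
print("OLS predictions:", ols.predict(X_new))

# Penalized versions of the least squares cost function.
ridge = Ridge(alpha=1.0).fit(X, y)   # L2 penalty shrinks coefficients toward zero
lasso = Lasso(alpha=0.1).fit(X, y)   # L1 penalty can set some coefficients exactly to zero

print("OLS coefficients:  ", np.round(ols.coef_, 2))
print("Ridge coefficients:", np.round(ridge.coef_, 2))
print("Lasso coefficients:", np.round(lasso.coef_, 2))
```

In a run like this, the lasso will typically drive the coefficients of the irrelevant explanatory variables to exactly zero, which is one way of flagging explanatory variables that carry little or redundant information about the response.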