# Autoregression – Definition


An autoregressive model (AR model) predicts current or future behavior in a time series from past behavior in the same series. The process is essentially a linear regression of the series' current values against one or more of its own past values.

A time series in this context is a sequence of data points recorded in time order, for example, the height of ocean tides measured at specific points in time. In an AR model, past data in such a series are used to forecast and model expected behavior, provided there is a correlation between selected values and the values that precede and succeed them.

Linear regression is a statistical model that assumes a linear relationship between variables: plotted on a graph, the independent and dependent variables fall along a straight line. The AR model likewise expresses a linear relationship, with lagged values of the series acting as the independent variables.
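The lagged linear regression described above can be sketched in a few lines. This is a minimal illustration of an AR(1) model fit by ordinary least squares; the series values and function names are invented for the example, not taken from any library.

```python
def fit_ar1(series):
    """Least-squares fit of x[t] = c + phi * x[t-1]."""
    x = series[:-1]  # lagged values (independent variable)
    y = series[1:]   # current values (dependent variable)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    phi = cov / var          # slope: weight on the lagged value
    c = my - phi * mx        # intercept
    return c, phi

def forecast_ar1(c, phi, last_value, steps=1):
    """Iterate the fitted equation to forecast future values."""
    out = []
    for _ in range(steps):
        last_value = c + phi * last_value
        out.append(last_value)
    return out

series = [10.0, 10.8, 11.5, 12.1, 12.6, 13.0, 13.3]  # illustrative data
c, phi = fit_ar1(series)
next_vals = forecast_ar1(c, phi, series[-1], steps=2)
```

Each forecast feeds back into the equation as the new "past value", which is what makes the model autoregressive.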

### A Little More on What an Autoregressive Model Is

Generally, an AR model assumes that the current state of a series can be explained by previous observations. It uses past outcomes to forecast the value at the next time step. It is suitable when one wants to measure the influence of past states on the current one.

Autocorrelation is the degree of similarity between the values of a series and a lagged copy of itself. It is also referred to as serial correlation because of the sequential structure of the data. A typical application is examining how past security prices influence future prices.
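As a sketch, serial correlation can be computed directly: the autocorrelation at lag 1 measures how strongly each value co-moves with the one before it. The price series below is invented for illustration.

```python
def autocorr(series, lag):
    """Sample autocorrelation: correlation of the series with itself shifted by `lag`."""
    n = len(series)
    mean = sum(series) / n
    denom = sum((v - mean) ** 2 for v in series)
    num = sum((series[t] - mean) * (series[t - lag] - mean) for t in range(lag, n))
    return num / denom

prices = [100, 101, 103, 102, 104, 106, 105, 107]  # illustrative security prices
r1 = autocorr(prices, 1)  # positive for a trending series
```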

The AR model assumes that a relationship exists between variables. This relationship can be positive, negative, or absent:

• Positive Correlation

A positive correlation is when both variables change together in the same direction; that is, they move up or down together. A good example is an iron bar increasing in length as temperature rises.

• Negative Correlation

A negative correlation is the opposite: the variables move in opposite directions as their values change. An example is the demand for a commodity rising as its price falls.

• Zero Correlation

Zero correlation occurs when there is no relationship between the two variables: the value of one changes while the other shows no systematic response.
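The three cases can be sketched with the Pearson correlation coefficient. The data sets below (temperature vs. bar length, price vs. demand) mirror the examples above and are invented for illustration.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    dx = sum((a - mx) ** 2 for a in xs) ** 0.5
    dy = sum((b - my) ** 2 for b in ys) ** 0.5
    return num / (dx * dy)

temperature = [10, 20, 30, 40]
bar_length = [100.0, 100.1, 100.2, 100.3]  # expands as temperature rises
price = [5, 4, 3, 2]
demand = [80, 90, 100, 110]                # rises as price falls

positive_r = pearson(temperature, bar_length)  # near +1
negative_r = pearson(price, demand)            # near -1
```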

### Calculating Correlation

Statistical measures can be used to calculate the strength and direction of a linear relationship between two variables. These measures are most meaningful when the variables are in fact linearly related. In an AR model, a lagged variable carries more weight in modeling future behavior when its correlation with the current value is stronger.

Correlation statistics help identify which lagged variables are likely to be useful in a model and which are not. If the lagged variables show low or no correlation with the output variable, the time series will be difficult to predict with an autoregression model. Running this check can be a useful first step on a new data set.
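One way to act on this, sketched below with an invented period-4 series, is to compute the correlation between the series and each candidate lag and keep the strongest; here the series repeats every four steps, so lag 4 wins.

```python
def lag_correlation(series, lag):
    """Pearson correlation between x[t] and x[t - lag]."""
    xs = series[lag:]
    ys = series[:-lag]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    dx = sum((a - mx) ** 2 for a in xs) ** 0.5
    dy = sum((b - my) ** 2 for b in ys) ** 0.5
    return num / (dx * dy)

series = [1, 2, 3, 4] * 4  # repeats every 4 steps
# Pick the candidate lag with the strongest (absolute) correlation.
best = max(range(1, 5), key=lambda k: abs(lag_correlation(series, k)))
```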

### AR Model Methods of Prediction

Prediction methods fall into three categories:

1. Qualitative Prediction Method

This method relies on expert judgment rather than numerical analysis, drawing on individuals' experience and analytical ability to form intuitions about future outcomes. Logical reasoning, rather than calculation, determines the predicted result.

Using economic data, management can apply qualitative prediction to anticipate future trends such as sales trends. Examples of qualitative prediction methods include the Delphi method, market research, historical life-cycle analogy, and informed opinion and judgment.

2. Time Series Prediction Method

This method uses time as the independent variable and produces forecasts from historical patterns in the data alone. Past statistical data are used to build a time series model, which is then used to estimate future values. Measurements are taken at successive points or over successive periods, whether hourly, weekly, monthly, or yearly.
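A minimal sketch of "time as the independent variable": fit a straight-line trend to past observations and extrapolate one step ahead. The monthly sales figures are invented for the example.

```python
def fit_trend(values):
    """Least-squares line value = a + b * t over t = 0..n-1 (time is the regressor)."""
    n = len(values)
    ts = range(n)
    mt = sum(ts) / n
    mv = sum(values) / n
    b = sum((t - mt) * (v - mv) for t, v in zip(ts, values)) / sum((t - mt) ** 2 for t in ts)
    a = mv - b * mt
    return a, b

monthly_sales = [120, 126, 131, 138, 142, 149]  # illustrative data
a, b = fit_trend(monthly_sales)
next_month = a + b * len(monthly_sales)  # extrapolate the trend one period ahead
```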

3. Causal Model Prediction

Causal model prediction assumes that the variable to be predicted has a cause-and-effect relationship with one or more independent variables. The purpose of causal forecasting is to find the best statistical relationship between a dependent variable and one or more independent variables.

Generally, it analyzes the quantity to be predicted together with related factors, producing a model that captures the causal relationship. Examples of causal prediction models include:

• Regression analysis
• Input-output method
• Econometric model, etc.

It is important to note that both causal models and time series models use statistical data and statistical methods to predict outcomes. For this reason, the two are sometimes referred to together as statistical forecasts.
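As a sketch of the regression-analysis variant of a causal model, the example below regresses a dependent variable (sales) on one independent variable (advertising spend); the figures and names are invented for illustration.

```python
def linreg(xs, ys):
    """Least-squares fit y = a + b * x for a single explanatory variable."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

ad_spend = [1.0, 2.0, 3.0, 4.0]      # independent (causal) variable
sales = [10.0, 14.0, 19.0, 23.0]     # dependent variable
a, b = linreg(ad_spend, sales)
predicted = a + b * 5.0              # predicted sales at a new spend level
```

Unlike the time series method, the regressor here is an external causal factor rather than time or the series' own past values.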

### Commonly Used AR Analysis and Prediction Methods

1. Time series analysis - predicts future changes by building correlation-identification models over time series of comprehensive indices.
2. Investment analysis model - commonly used to analyze markets.
3. Neural network prediction model - a more recently established approach to time series analysis.
4. Other forecasting methods include qualitative methods, expert assessment, the Markov model, market research, discriminant analysis, and seasonal variation analysis, among others.

### Autoregressive Model Benefits

• The autocorrelation function can be used to test whether the data lack randomness.
• It can forecast recurring patterns in the data.
• It requires relatively little information, since the series' own past values are used to forecast outcomes.

### Autoregressive Model Limitations

This method is subject to the following limitations:

• The autocorrelation coefficient should be at least 0.5 for the model to be suitable; if it falls below 0.5, the model is considered unsuitable and the prediction becomes markedly less accurate.
• It is mainly applicable to economic predictions based on pre-existing time series and performs poorly when the outcome is strongly affected by social factors. Because of these limitations, a vector autoregressive model is often recommended instead, since it can predict multiple time series variables with a single model.
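The 0.5 cutoff above can be turned into a simple suitability check. This sketch uses the lag-1 autocorrelation coefficient and invented series: a trending series passes the check, while a rapidly oscillating one fails.

```python
def lag1_autocorr(series):
    """Lag-1 sample autocorrelation of a series."""
    n = len(series)
    mean = sum(series) / n
    denom = sum((v - mean) ** 2 for v in series)
    num = sum((series[t] - mean) * (series[t - 1] - mean) for t in range(1, n))
    return num / denom

def ar_suitable(series, threshold=0.5):
    """Treat an AR model as suitable only when autocorrelation reaches the threshold."""
    return lag1_autocorr(series) >= threshold

trending = [1, 2, 3, 4, 5, 6, 7, 8]   # strong serial dependence
noisy = [5, 1, 6, 2, 7, 1, 6, 2]      # alternates, weak positive dependence
```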
