Difference between autoregression and autocorrelation


I am struggling to understand the subtle difference between autoregression and autocorrelation.

I know that autoregression means $x(t)=\alpha x(t-1)$ and autocorrelation means $\text{corr}(x(t),x(t-1))>0$...

...but surely if there is autoregression then we have autocorrelation, and vice versa.

I'm sure other people out there are having this difficulty. Please extend this argument to the $n$th-order case if necessary; I chose the first-order case for simplicity.

3 Answers

Accepted answer:

For me, auto-correlation is to auto-regression as correlation is to multiple regression. Namely, (auto)correlation is the "state of nature" that is modeled with autoregression. However, it can be modeled with a moving average as well, or with both, e.g., AR(I)MA models. Assuming that the data were generated by some $AR(p)$ process is usually too naive. As in linear regression, the more plausible view is that there is some structure of autocorrelation in your data that can be approximated (modeled) with an autoregressive model.
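To make the "state of nature" point concrete, here is a minimal numerical sketch (using NumPy; the coefficient, seed, and series length are arbitrary choices of mine): a series generated by an AR(1) process exhibits lag-1 sample autocorrelation close to its AR coefficient.

```python
import numpy as np

# Simulate an AR(1) process x_t = alpha * x_{t-1} + noise
# (illustrative sketch; alpha, n, and the seed are arbitrary)
rng = np.random.default_rng(0)
alpha, n = 0.7, 100_000
x = np.zeros(n)
for t in range(1, n):
    x[t] = alpha * x[t - 1] + rng.standard_normal()

# Lag-1 sample autocorrelation of the realized series
lag1_corr = np.corrcoef(x[1:], x[:-1])[0, 1]
print(round(lag1_corr, 2))  # close to alpha = 0.7
```

The converse direction is the modeling question: observing `lag1_corr > 0` does not by itself tell you whether an AR, MA, or ARMA specification generated the data.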

Answer:

Autocorrelation is

$$\text{Cor}(X_t,X_{t-k})$$

and the regression coefficient in simple linear regression is

$$\beta = \text{Cor}(Y, X)\frac{\sigma_Y}{\sigma_X}$$

so

$$\alpha = \text{Cor}(X_t, X_{t-k}) \frac{\sigma_{X_t}}{\sigma_{X_{t-k}}}=\text{Cor}(X_t, X_{t-k})$$

if we assume that $\sigma_{X_t} = \sigma_{X_{t-k}}$.
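This identity can be checked numerically (a sketch assuming NumPy; the random-walk series and the lag are arbitrary choices): the OLS slope of $x_t$ on $x_{t-k}$ equals the lag-$k$ correlation times the ratio of the standard deviations.

```python
import numpy as np

# Numerical check (sketch): the OLS slope of x_t on x_{t-k} equals
# corr(x_t, x_{t-k}) * sigma_{x_t} / sigma_{x_{t-k}}
rng = np.random.default_rng(1)
x = rng.standard_normal(10_000).cumsum()  # random walk: strongly autocorrelated
k = 1
y, z = x[k:], x[:-k]                      # x_t and x_{t-k}

slope = np.polyfit(z, y, 1)[0]            # least-squares slope (with intercept)
corr = np.corrcoef(y, z)[0, 1]
ratio = y.std() / z.std()

print(np.isclose(slope, corr * ratio))    # True
```

For a (weakly) stationary series the two standard deviations are equal by definition, which is what collapses $\alpha$ to the autocorrelation itself.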

Answer:

Just rewriting Benjamin's answer:


Simple linear regression without the intercept term (single regressor) $$\beta = \frac{\sum_i x_i y_i}{\sum_i x^2_i}.$$

Autocovariance function $$ \text{ACF}(\tau)=\sum_{t\in \mathbb{Z}} x_t x_{t-\tau}$$ where we extend $\{x_i\}_{i=0}^{n}$ by zero to a sequence on $\mathbb{Z}$.

So for the autoregression $x_t=\alpha_{\tau} x_{t-\tau}$ we have $$\alpha_{\tau} = \frac{\sum_{t\in\mathbb{Z}} x_t x_{t-\tau}}{\sum_{t\in\mathbb{Z}} x_t^2}=\frac{\text{ACF}(\tau)}{\text{ACF}(0)}.$$
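A numerical sketch of this last identity (assuming NumPy; the series and lag are arbitrary choices): with the series zero-extended as above, the no-intercept least-squares slope of $x_t$ on $x_{t-\tau}$ equals $\text{ACF}(\tau)/\text{ACF}(0)$ exactly, because the denominator $\sum_t x_{t-\tau}^2$ over all of $\mathbb{Z}$ is just $\sum_t x_t^2$.

```python
import numpy as np

# Sketch: zero-extend the series so every overlapping pair (x_t, x_{t-tau})
# appears, then compare the no-intercept slope with ACF(tau)/ACF(0).
rng = np.random.default_rng(2)
x = rng.standard_normal(1_000).cumsum()   # an arbitrary autocorrelated series
tau = 3

xt   = np.concatenate([x, np.zeros(tau)])   # x_t     for t = 0 .. n+tau-1
xlag = np.concatenate([np.zeros(tau), x])   # x_{t-tau} on the same range

alpha = np.dot(xt, xlag) / np.dot(xlag, xlag)          # no-intercept slope
acf_ratio = np.dot(x[tau:], x[:-tau]) / np.dot(x, x)   # ACF(tau) / ACF(0)

print(np.isclose(alpha, acf_ratio))  # True
```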