AR(1)-process: Conditional distribution


Suppose I have an AR(p)-process $(X_{t})_{t\in\mathbb{Z}}$, for example the following AR(1)-process:

$$X_{t}=\alpha_{0} + \alpha_{1}X_{t-1}+\epsilon_{t}$$ where $ \epsilon_{t}\sim D(0,\sigma^2) $ is an uncorrelated, zero-mean, finite-variance process (white noise) with distribution function $D$ (e.g. the normal distribution).

Given the information set $\Phi_{t-1}$, the conditional distribution of $X_t$ is $$X_t|\Phi_{t-1}\sim D(\alpha_{0} + \alpha_{1}X_{t-1},\sigma^2).$$
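As a quick numerical sanity check of this conditional distribution, here is a minimal sketch with Gaussian noise and illustrative parameter values (the specific numbers are my own, not from the question): conditional on a fixed $X_{t-1}$, the simulated $X_t$ should have mean $\alpha_0 + \alpha_1 X_{t-1}$ and variance $\sigma^2$.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative AR(1) parameters (hypothetical values)
alpha0, alpha1, sigma = 0.5, 0.8, 1.0

# Draw many one-step transitions from the same fixed X_{t-1}
x_prev = 2.0
eps = rng.normal(0.0, sigma, size=100_000)
x_t = alpha0 + alpha1 * x_prev + eps

# Conditional mean should be alpha0 + alpha1 * x_prev = 2.1,
# conditional variance should be sigma^2 = 1.0
print(round(x_t.mean(), 2))
print(round(x_t.var(), 2))
```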

  1. How do we know that $\epsilon_{t}$ is independent of all past values $\{X_{t-1},X_{t-2},\dots\}$? The assumption that $\epsilon_{t}$ is white noise does not imply its independence of past values of $X_t$.

So I guess "$\epsilon_{t}$ is independent of $\{X_{t-1},X_{t-2},\dots\}$" is just another assumption for AR-processes?

  2. Does saying that $\epsilon_t$ is independent of the infinite set $\{X_{t-1},X_{t-2},\dots\}$ mean that $\epsilon_t$ is independent of every subset of $\{X_{t-1},X_{t-2},\dots\}$?
> How do we know that $\epsilon_{t}$ is independent of all past values $\{X_{t-1},X_{t-2},\dots\}$? The assumption that $\epsilon_{t}$ is white noise does not imply its independence of past values of $X_t$.

"White noise" is sometimes (not always) taken to imply that terms are independent, not just uncorrelated. If $\epsilon_t$ is independent of all previous $\epsilon_{t-n}$ then it must also be independent of any function of those variables.

But $X_{t-n}$ can be expressed as an infinite sum involving only $\alpha_{0}$, $\alpha_{1}$, and $\epsilon_{t-n}, \epsilon_{t-n-1}, \epsilon_{t-n-2}, \dots$: assuming $|\alpha_1|<1$, iterating the recursion gives $$X_{t-n}=\frac{\alpha_0}{1-\alpha_1}+\sum_{k=0}^{\infty}\alpha_1^{k}\epsilon_{t-n-k},$$ and hence $\epsilon_t$ must be independent of $X_{t-n}$ for every positive $n$.
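This unrolling of the recursion into an MA($\infty$) sum can be checked numerically. A minimal sketch, with illustrative parameter values and the infinite sum truncated (since $|\alpha_1|<1$, the tail is negligible):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative AR(1) parameters (hypothetical values), |alpha1| < 1
alpha0, alpha1, sigma = 1.0, 0.6, 0.5
n, burn = 200, 1000

# Simulate the recursion X_t = alpha0 + alpha1*X_{t-1} + eps_t,
# started at the stationary mean and run through a long burn-in.
eps = rng.normal(0.0, sigma, size=burn + n)
x = np.empty(burn + n)
x[0] = alpha0 / (1 - alpha1)
for t in range(1, burn + n):
    x[t] = alpha0 + alpha1 * x[t - 1] + eps[t]

# Truncated MA(infinity) representation:
# X_t ~ alpha0/(1-alpha1) + sum_{k=0}^{K} alpha1^k * eps_{t-k}
K = 60  # alpha1**61 is ~3e-14, so the truncation error is tiny
t = burn + n - 1
ma_approx = alpha0 / (1 - alpha1) + sum(
    alpha1**k * eps[t - k] for k in range(K + 1)
)
print(abs(x[t] - ma_approx))  # truncation error only
```

The truncated sum reproduces the recursively simulated value up to the geometric tail, which is exactly why $X_{t-n}$ is a function of past noise terms alone.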

If we apply only the "uncorrelated" interpretation, then it's definitely not true that $\epsilon_t$ is independent of past X-values.

(Trivial example: choose any white-noise sequence $(\epsilon_t)$ that is uncorrelated but not independent, and set $\alpha_0 = \alpha_1 = 0$; then $X_t = \epsilon_t$, so $\epsilon_t$ is clearly not independent of $X_{t-1} = \epsilon_{t-1}$.)
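One concrete uncorrelated-but-dependent construction (my own illustration, not from the answer) is $\epsilon_t = Z_t Z_{t-1}$ with $Z_t$ i.i.d. standard normal: the lag-1 autocorrelation is zero, but the squares are correlated, so with $\alpha_0=\alpha_1=0$ the noise $\epsilon_t$ is clearly not independent of $X_{t-1}$. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# White noise that is uncorrelated but NOT independent:
# eps_t = Z_t * Z_{t-1} with Z_t iid N(0,1).
z = rng.normal(size=n + 1)
eps = z[1:] * z[:-1]

# With alpha0 = alpha1 = 0 we simply have X_t = eps_t.
x = eps

# Lag-1 autocorrelation is (numerically) zero ...
corr = np.corrcoef(x[1:], x[:-1])[0, 1]
print(round(corr, 3))

# ... but the squares are correlated (both contain Z_{t-1}^2),
# so eps_t is not independent of X_{t-1}.
corr_sq = np.corrcoef(x[1:] ** 2, x[:-1] ** 2)[0, 1]
print(round(corr_sq, 3))
```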