Show that $X_{t}:=\alpha X_{t-1}+\epsilon_{t}$ is strictly stationary for $|\alpha|<1$ and $\epsilon_{t}$ i.i.d. $\sim N(0,\sigma^{2})$.


The title can be shortened to "prove that $AR(1)$ processes are strictly stationary when $|\alpha|<1$". This has been discussed many times on MSE and Cross Validated, but I found no mathematical proof of why it is strictly stationary.

For $t\in\mathbb{Z}$, consider the process recursively defined as $X_{t}:=\alpha X_{t-1}+\epsilon_{t}$, where the $\epsilon_{t}$ are i.i.d. $\sim N(0,\sigma^{2})$. I want to show that the process $\{X_{t}:t\in\mathbb{Z}\}$ is strictly stationary when $|\alpha|<1$.


I have a partial attempt, but I got stuck at the end.

First, iterating the recursion gives $$X_{t}=\alpha^{n}X_{t-n}+\sum_{k=0}^{n-1}\alpha^{k}\epsilon_{t-k}.$$ For fixed $t$, define $Y_{n}:=\sum_{k=0}^{n-1}\alpha^{k}\epsilon_{t-k}$. We recall that any linear combination of independent Gaussian random variables is Gaussian. In particular, since the $\epsilon_{t}$ are i.i.d. $N(0,\sigma^{2})$, it follows that $Y_{n}\sim N(0,\sigma_{n}^{2})$, where we set $\sigma_{n}^{2}:=\sigma^{2}\sum_{k=0}^{n-1}\alpha^{2k}$. Since $|\alpha|<1$, the variance converges as $n\rightarrow\infty$: $$\sigma_{n}^{2}\longrightarrow\sigma^{2}\sum_{k=0}^{\infty}\alpha^{2k}=\dfrac{\sigma^{2}}{1-\alpha^{2}}.$$

Consider the characteristic function $\varphi_{n}(u)$ of $Y_{n}$. It has the form $$\varphi_{n}(u):=\mathbb{E}(e^{iuY_{n}})=e^{iu\mu_{Y_{n}}-\frac{1}{2}\sigma^{2}_{Y_{n}}u^{2}}=e^{-\frac{1}{2}\sigma^{2}_{n}u^{2}}\longrightarrow e^{-\frac{1}{2}\frac{\sigma^{2}}{1-\alpha^{2}}u^{2}}=:\varphi(u).$$ We note that $\varphi$ is the characteristic function of $Y\sim N(0,\frac{\sigma^{2}}{1-\alpha^{2}})$ and is continuous at $u=0$. Hence, by Lévy's continuity theorem, $Y_{n}\longrightarrow Y$ in distribution as $n\rightarrow\infty$. Now, since $|\alpha|<1$, we have $\alpha^{n}X_{t-n}\longrightarrow 0$ in probability (provided, e.g., the family $\{X_{s}\}$ is tight), so Slutsky's theorem gives $$X_{t}=\alpha^{n}X_{t-n}+Y_{n}\longrightarrow Y,\ \text{in distribution, as}\ n\rightarrow\infty.$$
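As a quick numerical sanity check on this limit (a sketch; the values $\alpha=0.7$, $\sigma=1$ and the path/burn-in counts below are illustrative choices, not fixed by the question), one can iterate the recursion from $X=0$ on many independent paths and compare the sample variance with $\sigma^{2}/(1-\alpha^{2})$:

```python
import numpy as np

# Numerical check of the limiting marginal law derived above.
# alpha and sigma are illustrative choices, not fixed by the question.
rng = np.random.default_rng(0)
alpha, sigma = 0.7, 1.0
n_paths, burn_in = 100_000, 500

# Iterate the recursion from X = 0 on many independent paths; after a long
# burn-in, the alpha**n * X_{t-n} term is negligible and X_t is close to
# its limiting distribution.
x = np.zeros(n_paths)
for _ in range(burn_in):
    x = alpha * x + rng.normal(0.0, sigma, size=n_paths)

print(x.mean())                    # should be near 0
print(x.var())                     # should be near sigma^2 / (1 - alpha^2)
print(sigma**2 / (1 - alpha**2))
```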


This is where I got stuck. From here we can see that, since the argument works for every $t\in\mathbb{Z}$, it follows that $X_{t}\stackrel{d}{=}Y$ for all $t$. So every random variable in the process is Gaussian with the same parameters.

This does not directly imply strict stationarity, because it says nothing about the joint distributions. However, $(X_{t})$ is a Gaussian process (every finite collection of $X_{t}$'s is a limit of linear combinations of the Gaussian $\epsilon$'s), and for Gaussian processes weak stationarity and strict stationarity are equivalent.

I know that $\mathbb{E}X_{t}=\mathbb{E}Y=0$ and $\mathbb{E}X_{t}^{2}=\mathbb{E}Y^{2}=\frac{\sigma^{2}}{1-\alpha^{2}}.$ However, I still don't know how to compute $\text{Cov}(X_{t},X_{s})$. Don't I need the joint distribution of $(X_{t},X_{s})$ for this?
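As a numerical hint at what the answer should be (a sketch with illustrative parameters $\alpha=0.7$, $\sigma=1$, lag $h=3$), a Monte Carlo estimate of $\text{Cov}(X_{t},X_{t+h})$ lines up with $\alpha^{h}\sigma^{2}/(1-\alpha^{2})$, the value the $MA(\infty)$ representation would give:

```python
import numpy as np

# Monte Carlo estimate of Cov(X_t, X_{t+h}); alpha, sigma, h are illustrative.
rng = np.random.default_rng(1)
alpha, sigma, h = 0.7, 1.0, 3
n_paths, burn_in = 100_000, 500

# Burn in the recursion so each path is (approximately) in the stationary regime.
x = np.zeros(n_paths)
for _ in range(burn_in):
    x = alpha * x + rng.normal(0.0, sigma, size=n_paths)
x_old = x.copy()                         # snapshot of X_t across paths
for _ in range(h):                       # advance h steps to get X_{t+h}
    x = alpha * x + rng.normal(0.0, sigma, size=n_paths)

emp = np.mean(x_old * x)                 # empirical covariance (means are ~0)
theory = alpha**h * sigma**2 / (1 - alpha**2)
print(emp, theory)                       # the two should be close
```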


Two extensions of this question:

  1. How can I show that $X_{t}$ is not strictly stationary when $\alpha=1$ (assume the process is real-valued)?

I know that when $\alpha=1$ the above argument does not work and the variance blows up, but I don't see how that alone rules out strict stationarity.

  2. What about $|\alpha|>1$?


On BEST ANSWER

Let's just prove the general result assuming only that the $\epsilon_t$ are i.i.d. and in $L^2$. Since $|\alpha|<1$, the series below has independent summands with summable variances, so it converges almost surely and in $L^2$, and we may write $$X_t = \sum_{k=0}^\infty \alpha^k \epsilon_{t- k} = f(\epsilon_t, \epsilon_{t-1}, \epsilon_{t-2}, \ldots)$$

where $f: \mathbb{R}^\infty \to \mathbb{R}$ is a measurable function. The sequence $(\epsilon_t)_{t \in \mathbb{Z}}$, being i.i.d., satisfies $(\epsilon_{t})_{t \in \mathbb{Z}} \stackrel{D}{=}(\epsilon_{t+h})_{t \in \mathbb{Z}}$ for all $h\in \mathbb{Z}$. Let $t_1 < \ldots < t_l$ be times, $h \in \mathbb{Z}$, and let $B$ be a Borel set in $\mathbb{R}^l$. Then: \begin{align*} \mathbb{P} \left(\begin{bmatrix} X_{t_1} \\ X_{t_2} \\ \vdots \\ X_{t_l} \end{bmatrix} \in B \right) &= \mathbb{P} \left(\begin{bmatrix} f(\epsilon_{t_1}, \epsilon_{t_1 -1 }, \ldots ) \\ f(\epsilon_{t_2}, \epsilon_{t_2 - 1}, \ldots ) \\ \vdots \\ f(\epsilon_{t_l}, \epsilon_{t_l -1}, \ldots ) \end{bmatrix} \in B \right) \\ &= \mathbb{P} \left(\begin{bmatrix} f(\epsilon_{t_1 +h}, \epsilon_{t_1 -1 +h}, \ldots ) \\ f(\epsilon_{t_2+h}, \epsilon_{t_2 - 1 +h}, \ldots ) \\ \vdots \\ f(\epsilon_{t_l+h}, \epsilon_{t_l -1+h}, \ldots ) \end{bmatrix} \in B \right) \\ &= \mathbb{P} \left(\begin{bmatrix} X_{t_1+h} \\ X_{t_2+h} \\ \vdots \\ X_{t_l+h} \end{bmatrix} \in B \right) \end{align*} which means that $(X_{t_1} , \ldots, X_{t_l}) \stackrel{D}{=} (X_{t_1+h} , \ldots, X_{t_l+h})$, showing that $(X_t)$ is strictly stationary.
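This construction is easy to check numerically: build $X_t$ from a truncated version of the $MA(\infty)$ sum (the truncation level $K$ and the parameters below are illustrative choices) and verify that joint moments do not depend on the time index, as stationarity predicts. A minimal sketch:

```python
import numpy as np

# Sketch of the answer's construction: approximate X_t by truncating the
# MA(infinity) sum at K terms (alpha^K is then negligible), and check that
# joint second moments are invariant under a time shift.
rng = np.random.default_rng(2)
alpha, sigma, K = 0.7, 1.0, 60
T, n_paths = 10, 100_000

eps = rng.normal(0.0, sigma, size=(n_paths, T + K))
weights = alpha ** np.arange(K)                  # (alpha^0, ..., alpha^{K-1})
# X[:, t] = sum_{k=0}^{K-1} alpha^k * eps_{t-k}; column t+K-1 of eps plays
# the role of the "current" noise for time t.
X = np.stack([eps[:, t:t + K][:, ::-1] @ weights for t in range(T)], axis=1)

# Lag-1 cross moments at two different times agree, as stationarity predicts.
print(np.mean(X[:, 0] * X[:, 1]))     # ~ alpha * sigma^2 / (1 - alpha^2)
print(np.mean(X[:, 5] * X[:, 6]))     # same value, up to Monte Carlo error
```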

For your extensions:

  1. For $|\alpha| = 1$, note that $X_{t}-\alpha^{n}X_{t-n}=\sum_{k=0}^{n-1}\alpha^{k}\epsilon_{t-k}\sim N(0,n\sigma^{2})$. If the process were strictly stationary, all $X_{t}$ would share a single distribution, so the family $\{X_{t}-\alpha^{n}X_{t-n}\}_{n}$ would be tight; but a tight family cannot contain $N(0,n\sigma^{2})$ for every $n$. Hence no strictly stationary solution exists.

  2. For $|\alpha| > 1$ the process still admits a strictly stationary solution, but it is now a function of the future noise. To find the $MA(\infty)$ form, rewrite the recursion as $X_{t}=\alpha^{-1}X_{t+1}-\alpha^{-1}\epsilon_{t+1}$ and iterate forward in time to express $X_t$ as a function of $\epsilon_{t+1}, \epsilon_{t+2}, \ldots$
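Both extensions can be sanity-checked numerically (a sketch; the parameters $\sigma=1$, $\alpha=2$ and truncation level $K=50$ are illustrative). For $\alpha=1$ the recursion is a random walk whose sample variance grows linearly in $t$; for $|\alpha|>1$ the forward iteration suggests the candidate $X_{t}=-\sum_{k\ge 1}\alpha^{-k}\epsilon_{t+k}$, and its truncation satisfies the recursion up to an $\alpha^{-K}$ error:

```python
import numpy as np

rng = np.random.default_rng(3)
sigma = 1.0

# Extension 1 (alpha = 1): the recursion is a random walk, so starting from
# X_0 = 0 we get Var(X_t) = t * sigma^2, which grows without bound.
n_paths = 100_000
x = np.zeros(n_paths)
for t in range(1, 101):
    x = x + rng.normal(0.0, sigma, size=n_paths)
print(x.var())                               # close to 100 = t * sigma^2

# Extension 2 (|alpha| > 1): the forward series X_t = -sum_{k>=1} alpha^{-k} eps_{t+k},
# truncated at K terms, solves X_t = alpha * X_{t-1} + eps_t up to alpha^{-K}.
alpha, K = 2.0, 50
eps = rng.normal(0.0, sigma, size=K + 1)     # eps[j] plays the role of eps_{t+j}
w = alpha ** -np.arange(1, K + 1)            # (alpha^{-1}, ..., alpha^{-K})
x_prev = -(w @ eps[:K])                      # X_{t-1} uses eps_t, ..., eps_{t+K-1}
x_curr = -(w @ eps[1:])                      # X_t uses eps_{t+1}, ..., eps_{t+K}
print(abs(x_curr - (alpha * x_prev + eps[0])))   # ~ 2^{-50}, essentially zero
```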