Is $X_t=X_{t-1}^{\alpha} + \varepsilon_t$ stationary for $\alpha<1$?


Let $\{\varepsilon_t\}$ be iid and non-negative. Consider the time series defined by $$X_t=cX_{t-1}^{\alpha} + \varepsilon_t,$$ with $0<\alpha<1$ and $c\in\mathbb{R}$. Is it strictly stationary?

If $\alpha=1$ we obtain the classic AR(1) process, where we need $|c|<1$ for stationarity. For smaller $\alpha$ it seems that $X_t$ is "smaller" and should also be stationary, but I have a hard time proving that. Do we then need some restriction on $c$ in that case?


Best answer:

If someone is interested, this problem is actually a special case of a nonlinear autoregressive process.

These are defined as $X_t=f(X_{t-1}) + \varepsilon_t$, where a sufficient condition for stationarity (even ergodicity) is that $f$ is a measurable function satisfying, for some $\rho\in[0,1)$ and $K>0$ (written $\rho$ here to avoid a clash with the $c$ in the question), $\|f(x)\|\leq \rho\|x\|+K$ for all $x\in\mathbb{R}$, together with $\mathbb{E}|\varepsilon_t|<\infty$. A reference is here, and a generalization for heavy-tailed noise (if we do not want to assume the existence of moments) is here.
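
Since $x \mapsto c x^{\alpha}$ grows sublinearly for $\alpha < 1$, the linear drift bound above holds for any fixed $c$ and any slope $\delta \in (0,1)$, with a large enough constant $K$. Here is a small numerical sanity check of that claim (the values of $\alpha$, $c$ and $\delta$ are my own illustrative choices, not from the post):

```python
import numpy as np

# Drift condition check for f(x) = c * x**alpha on x >= 0, with 0 < alpha < 1.
# Because x**alpha is sublinear, for ANY slope delta in (0, 1) there is a
# finite K with f(x) <= delta * x + K -- so the criterion imposes no
# restriction on c. Parameters below are illustrative, not from the post.
alpha, c, delta = 0.5, 5.0, 0.1

def f(x):
    return c * x**alpha

# K = max_x (f(x) - delta*x), attained at x* = (c*alpha/delta)**(1/(1-alpha)).
x_star = (c * alpha / delta) ** (1.0 / (1.0 - alpha))
K = f(x_star) - delta * x_star

xs = np.linspace(0.0, 1e6, 2_000_001)
assert np.all(f(xs) <= delta * xs + K + 1e-9)
print(f"f(x) <= {delta}*x + {K:.3f} holds on the grid")
```

Note that $c = 5$ here is far outside $(0,1)$; the bound still holds because all of the "excess" growth is absorbed into the additive constant $K$.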

Another answer:

One can see that it is not stationary in the special case where the variance of the shock is zero. You have

$$X_1=cX_0^{\alpha}$$

$$X_2=cX_1^{\alpha}=c^{1+\alpha}X_0^{\alpha^2}$$

and generally

$$X_t=c^{1+\alpha+\cdots+\alpha^{t-1}}X_0^{\alpha^t}=c^{\frac{1-\alpha^t}{1-\alpha}}X_0^{\alpha^t}$$

This converges to different limits depending on $X_0$: when $X_0=0$ the process stays at zero, while for any $X_0>0$ it converges to $c^{1/(1-\alpha)}$. For example, when $\alpha=0.01$, $c=0.99$ and $X_0=0.01$ it converges to $0.99^{1/0.99}\approx0.99$. Hence, started from an arbitrary fixed $X_0$, the process does not have a well-defined unconditional mean. I believe this argument could be generalized to the case with a strictly positive variance.
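
As a quick sanity check of the recursion, this sketch iterates the zero-noise map with the values from this answer and compares each step with the closed form:

```python
import math

# Zero-noise recursion X_t = c * X_{t-1}**alpha versus the closed form
# X_t = c**((1 - alpha**t)/(1 - alpha)) * X_0**(alpha**t).
# Values of alpha, c, X_0 are the ones used in this answer.
alpha, c, x0 = 0.01, 0.99, 0.01

x = x0
for t in range(1, 51):
    x = c * x**alpha
    closed = c ** ((1 - alpha**t) / (1 - alpha)) * x0 ** (alpha**t)
    assert abs(x - closed) < 1e-9

limit = c ** (1.0 / (1.0 - alpha))   # the X_0-independent limit for X_0 > 0
print(x, limit)                      # both approximately 0.9899
```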

Another answer:

The answer will depend on your parameters and your choice of noise. I'll give a partial answer, using the following reference: Peigné, Woess, Stochastic dynamical systems with weak contractivity properties. I. Strong and local contractivity, which contains many other helpful references (Goldie...).

First, I will assume $c > 0$ and exclude the value $0$ from the domain, so that $(X_n)$ takes its values in $\mathbb{R}_+^*$. I write $x := X_0$, and set:

$$Y_n^{\ln (x)} := \ln (X_n^x),$$

$$\Psi_\varepsilon (y) := \ln(ce^{\alpha y} + \varepsilon).$$

Note that $0 \leq \Psi'_\varepsilon \leq \alpha < 1$ for all $\varepsilon$.

Then $Y_{n+1}^y = \Psi_{\varepsilon_n} (Y_n^y)$, and $(\Psi_{\varepsilon_n})_{n \geq 0}$ is an i.i.d. family of contraction mappings. The sequence $(Y_n)$ is a Stochastic Dynamical System (SDS). In addition, since all the mappings are $\alpha$-contracting, we get $\lim_{n \to + \infty} |Y_n^x-Y_n^y| = 0$ almost surely for all $x$, $y$. Using the terminology of the article, this SDS is strongly contractive.
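
The strong contraction is easy to see numerically: run two copies of the chain from different starting points but driven by the same shocks, and the gap shrinks at least geometrically. A minimal sketch (the Exp(1) noise distribution and the starting points are my own illustrative assumptions):

```python
import math
import random

# Two copies of Y_{n+1} = Psi_{eps_n}(Y_n), Psi_eps(y) = ln(c*e^(alpha*y) + eps),
# driven by the SAME noise sequence. Each Psi_eps has 0 <= Psi' <= alpha, so
# |Y_n^x - Y_n^y| <= alpha**n * |x - y|.
alpha, c = 0.5, 2.0
random.seed(0)

def psi(y, eps):
    return math.log(c * math.exp(alpha * y) + eps)

y1, y2 = -10.0, 10.0          # two starting points, shared shocks below
gap0 = abs(y1 - y2)
for n in range(1, 41):
    eps = random.expovariate(1.0)   # same shock applied to both chains
    y1, y2 = psi(y1, eps), psi(y2, eps)
    assert abs(y1 - y2) <= alpha**n * gap0 + 1e-9

print(abs(y1 - y2))   # essentially 0: the two chains have coupled
```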

Under the additional hypothesis that the SDS is recurrent, Theorem 2.13 tells you that, up to multiplication by a constant, there is a unique invariant Radon measure $\nu$ for which the Markov chain $(Y_n)$ is ergodic, supported exactly on the smallest nonempty closed subset $L$ such that $\mathbb{P} (\Psi_\varepsilon (L) \subset L) = 1$. There are further conditions guaranteeing that $\nu(L) < +\infty$ (positive recurrence), which, I think, is more than what you asked for.

Now, let me go back to this recurrence criterion. It asserts that

$$\mathbb{P} (\liminf_{n \to + \infty} |Y_n^y| < +\infty) = 1$$

for one (or, equivalently here, every) $y$. Since the $\Psi_\varepsilon$ are bounded from below by $\ln (\varepsilon)$, only the upper bound is non-trivial. And here it will depend on the distribution of the noise $\varepsilon$: if it is very heavy-tailed, I can imagine that the Markov chain $(Y_n)$ is not recurrent. Anyway, let us get an upper bound: for positive $y$,

$$\Psi_\varepsilon(y) \leq \ln (c+\varepsilon) + \alpha y,$$

whence, if $(Y_m^y, \ldots, Y_{n+m}^y)$ are positive,

$$Y_{n+m}^y \leq \alpha^n Y_m^y + \sum_{k = 0}^{n-1} \alpha^k \ln(c+\varepsilon_{n+m-1-k}).$$

In particular, if $\lim_{n \to + \infty} Y_n^y = +\infty$, then

$$\lim_{n \to + \infty} \sum_{k = 0}^{n-1} \alpha^k \ln(c+\varepsilon_{n-1-k}) = +\infty.$$

But $\sum_{k = 0}^{n-1} \alpha^k \ln(c+\varepsilon_{n-1-k})$ has the same distribution as $\sum_{k = 0}^{n-1} \alpha^k \ln(c+\varepsilon_k)$. In particular, if $\mathbb{E} |\ln(c+\varepsilon)| < +\infty$, then $\sum_{k = 0}^{n-1} \alpha^k \ln(c+\varepsilon_k)$ converges in $\mathbb{L}^1$, so almost surely

$$\liminf_{n \to + \infty} \sum_{k = 0}^{n-1} \alpha^k \ln(c+\varepsilon_{n-1-k}) < +\infty,$$

and the Markov chain is recurrent. Hence, all you need for the existence and uniqueness of an invariant measure is that $\varepsilon >0$ almost surely and $\mathbb{E} (\ln(1+\varepsilon)) < +\infty$.
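
To illustrate the last step, one can simulate the weighted sums $\sum_k \alpha^k \ln(c+\varepsilon_k)$: with a light-tailed noise satisfying $\mathbb{E}\ln(1+\varepsilon)<\infty$, the partial sums settle down very quickly. A minimal sketch (the Exp(1) noise and the values of $\alpha$, $c$ are my own illustrative assumptions):

```python
import math
import random

# Partial sums S_n = sum_{k < n} alpha**k * ln(c + eps_k) from the argument
# above. With eps ~ Exp(1) (so E ln(1+eps) < infinity), the geometric weights
# force the series to converge almost surely, so the liminf in the
# recurrence criterion is finite.
alpha, c = 0.5, 2.0
random.seed(1)

partial, s = [], 0.0
for k in range(200):
    s += alpha**k * math.log(c + random.expovariate(1.0))
    partial.append(s)

# The tail beyond step n is at most alpha**n / (1 - alpha) times the largest
# late log-term, so late partial sums are indistinguishable in floats.
assert abs(partial[-1] - partial[100]) < 1e-12
print(partial[-1])
```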