Is the finite difference of an iid discrete-time stochastic process also iid, or a Markov process?


Let $\{X_n\}$ be a discrete-time stochastic process, i.e. a sequence of random variables $X_n,\ n \in \mathbb{N}$. Assume that the $X_n$ are independent and identically distributed (iid). Let $Y_n = X_{n+1} - X_n$ define the forward finite difference sequence $\{Y_n\}$. Are the $Y_n$ also iid?

My intuition is no, because of regression toward the mean: roughly speaking, if $X_n < \bar{X}$ (where the overbar denotes expectation), then $X$ will probably increase at the next time step, so that $E[Y_n \mid X_n] > 0$, and similarly if $X_n > \bar{X}$ then $E[Y_n \mid X_n] < 0$. I guess the way to formalize this is that $Y_n$ is not independent of $X_n$ (for a fixed value of $n$). Along the same lines, a big jump above/below the mean is likely to be followed by a jump back toward the mean, so I suspect that $\operatorname{cov}(Y_n, Y_{n+1}) < 0$. If so, how can I show this? Are non-adjacent $Y_n$'s also dependent, or is $\{Y_n\}$ a Markov process?


In general the answer is no. Suppose $\sigma^2=\operatorname{Var}(X_n)$ is finite and positive. Then you can use the fact that $\operatorname{Cov}(X_n,X_m)=0$ if $m\ne n$ and $=\sigma^2$ if $m=n$ to compute $$\operatorname{Cov}(Y_n,Y_{n+1}) = \operatorname{Cov}((X_{n+1}-X_n), (X_{n+2}-X_{n+1})) = \operatorname{Cov}(X_{n+1},-X_{n+1})=-\sigma^2.$$

So $Y_n$ and $Y_{n+1}$ are correlated, and hence not independent.
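As a quick sanity check of this covariance computation, here is a minimal simulation sketch. The choice of an exponential(1) distribution (which has $\sigma^2 = 1$) is arbitrary; any iid distribution with finite positive variance would do.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw a long iid sequence X_n; exponential(1) is an arbitrary example
# distribution with variance sigma^2 = 1.
x = rng.exponential(scale=1.0, size=1_000_000)

# Forward differences Y_n = X_{n+1} - X_n.
y = np.diff(x)

# Sample covariance of adjacent pairs (Y_n, Y_{n+1}).
cov = np.cov(y[:-1], y[1:])[0, 1]

print(cov)  # close to -sigma^2 = -1
```

The sample covariance comes out close to $-1 = -\sigma^2$, matching the formula above.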

If $\sigma^2=0$, the $X_n$ are not really random, and the $Y_n$ sequence degenerates to a sequence of non-random 0 values, which is technically i.i.d., but uninteresting.

One way to start to prove the $Y_n$ are not independent in the infinite variance case is to evaluate the 2-dimensional characteristic function $\phi(t,u)=E \exp(i(tY_n+uY_{n+1})),$ which works out to $\varphi(-t)\varphi(t-u)\varphi(u)$, where $\varphi(t)=E\exp(itX_n)$ is the characteristic function of $X_n$. If $Y_n$ and $Y_{n+1}$ were independent then $\phi$ would factor as $\phi(t,u)=\phi(t,0)\phi(0,u)$, which translates into a condition on the $\varphi$ function that is typically not satisfied. (I think it boils down to $\varphi(u-t)=\varphi(u)\varphi(-t)$ for all $t,u$ close to $0$, which forces the original $X_n$ to be degenerate.)