Understanding the Independent blocks construction of an ergodic process


I've wanted for a long time to understand the construction of an ergodic process $(Z_m)_{m \in \mathbb N}$ from a non-ergodic (but stationary) process $(X_m)_{m \in \mathbb N}$. For this, consider the following Independent blocks problem from Durrett's book:

Let $X_1, X_2, \ldots$ be a stationary sequence. Let $n<\infty$ and let $Y_1, Y_2, \ldots$ be a sequence so that $\left(Y_{n k+1}, \ldots, Y_{n(k+1)}\right), k \geq 0$ are i.i.d. and $\left(Y_1, \ldots, Y_n\right)\stackrel{d}{=}\left(X_1, \ldots, X_n\right)$ (equality in distribution). Finally, let $v$ be uniformly distributed on $\{1,2, \ldots, n\}$, independent of $Y$, and let $Z_m=Y_{v+m}$ for $m \geq 1$. Show that $Z$ is stationary and ergodic.

I have two questions:

  1. What is the sequence $Y_1, Y_2,\ldots$? How does one build it? From the statement of the theorem, I can only say that $\left(Y_1, \ldots, Y_n\right)\stackrel{d}{=}\left(X_1, \ldots, X_n\right)$. But it's hard for me to understand the behavior of the entire sequence (I'm thinking in particular of how to set up a computer simulation).
  2. The second question: how does one show that $Z$ converges to $(X_1, X_2,\ldots)$ as $n \to \infty$ in the weak topology, i.e., that the finite-dimensional distributions converge?

Help!

My attempt

I will use $t_i$ instead of $m_i$. Fix $t_1<t_2$. For $n$ large enough, $t_1 < t_2 < n$. Write $v+t_i = n\kappa_i + r_i$ with $\kappa_i \in \{0,1\}$ and $1 \leq r_i \leq n$ $(i=1,2)$; then $$(Z_{t_1}, Z_{t_2})= (Y_{v+t_1}, Y_{v+t_2}) \stackrel{d}{=} (X_{n\kappa_1 + r_1}, X_{n\kappa_2 + r_2}) =: W.$$

So, for $u=(u_1,u_2)$: $$ \varphi_W(u)= \mathbb E \left[ e^{ i u 'W } \right] = \mathbb E \left[ e^{ i (u_1 X_{n\kappa_1+r_1} + u_2 X_{n\kappa_2+r_2} )} \right]. $$ Conditioning on $(\kappa_1,\kappa_2)$ (note that $\kappa_1 \leq \kappa_2$ since $t_1<t_2$, so the case $\kappa_1=1,\kappa_2=0$ does not occur): $$ \begin{aligned} \varphi_W(u)&= \mathbb E \left[ e^{ i (u_1 X_{t_1} + u_2 X_{t_2} )} \right]\mathbb P[\kappa_1 = 0,\kappa_2 = 0] \\ &+ \mathbb E \left[ e^{ i (u_1 X_{t_1} + u_2 X_{r_2} )} \,\middle|\, \kappa_1 = 0,\kappa_2 = 1\right] \mathbb P[\kappa_1 = 0,\kappa_2 = 1] \\ &+ \mathbb E \left[ e^{ i (u_1 X_{r_1} + u_2 X_{r_2} )} \,\middle|\, \kappa_1 = 1,\kappa_2 = 1\right]\mathbb P[\kappa_1 = 1,\kappa_2 = 1]. \end{aligned} $$

Since $\kappa_1$ and $\kappa_2$ are both functions of the same $v$, the joint probabilities are $\mathbb P[\kappa_1=0,\kappa_2=0]=\mathbb P[v \leq n-t_2]=1-t_2/n$, $\mathbb P[\kappa_1=0,\kappa_2=1]=(t_2-t_1)/n$, and $\mathbb P[\kappa_1=1,\kappa_2=1]=t_1/n$. As every characteristic function is bounded by $1$ in modulus, the last two terms are together at most $t_2/n$ in absolute value, so: $$ \varphi_W(u)= \varphi_{(X_{t_1}, X_{t_2})}(u)\,(1- t_2/n) + O(t_2/n). $$ This shows that $\varphi_W(u)\to \varphi_{(X_{t_1}, X_{t_2})}(u)$, as $n \to \infty$.
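One point worth double-checking numerically: $\kappa_1$ and $\kappa_2$ are not independent, so their joint law must come from enumerating $v \in \{1,\dots,n\}$ rather than from multiplying marginals. A small Python sketch (the values $n=10$, $t_1=2$, $t_2=5$ are arbitrary choices for the check):

```python
from fractions import Fraction

def kappa_joint(n, t1, t2):
    """Joint law of (kappa_1, kappa_2), where v is uniform on {1,...,n}
    and kappa_i = 1 iff v + t_i lands in the second block (v + t_i > n).
    Assumes t1 < t2 < n, so only two blocks are involved."""
    counts = {}
    for v in range(1, n + 1):
        key = (int(v + t1 > n), int(v + t2 > n))
        counts[key] = counts.get(key, 0) + 1
    return {k: Fraction(c, n) for k, c in counts.items()}

n, t1, t2 = 10, 2, 5
probs = kappa_joint(n, t1, t2)
# P[0,0] = 1 - t2/n,  P[0,1] = (t2 - t1)/n,  P[1,1] = t1/n,
# and the event {kappa_1 = 1, kappa_2 = 0} never occurs.
```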

Answer

  1. You can build the sequence $(Y_j)$ in the following way: enlarging the probability space if needed, take independent copies of $(X_1,\dots,X_n)$, say $\left(V^{(k)}\right)_{k\geqslant 0}$, and set $Y_{nk+i}=V^{(k)}_{i}$ for $1\leqslant i\leqslant n$ and $k\geqslant 0$.
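For the simulation the asker has in mind, this construction can be sketched as follows (a minimal Python sketch; the stationary AR(1) driving process and the seed are illustrative assumptions, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_X_block(n):
    """Sample one copy of (X_1, ..., X_n) from some stationary process.
    Illustrative choice: a stationary Gaussian AR(1), X_{m+1} = a X_m + e_m."""
    a = 0.7
    x = np.empty(n)
    x[0] = rng.normal(scale=1 / np.sqrt(1 - a**2))  # stationary marginal
    for m in range(1, n):
        x[m] = a * x[m - 1] + rng.normal()
    return x

def sample_Z(n, length):
    """Build Y by concatenating i.i.d. copies V^(0), V^(1), ... of the
    n-block, then set Z_m = Y_{v+m} with v uniform on {1, ..., n}."""
    n_blocks = (length + 2 * n) // n  # enough blocks to cover v + length
    Y = np.concatenate([sample_X_block(n) for _ in range(n_blocks)])
    v = int(rng.integers(1, n + 1))   # uniform on {1, ..., n}
    # Y is 0-indexed here, so Y_{v+m} for m = 1..length is Y[v : v + length]
    return Y[v : v + length]

Z = sample_Z(n=50, length=200)
```

Each call to `sample_X_block` is one independent copy $V^{(k)}$, so the blocks of `Y` are i.i.d. while each block has the law of $(X_1,\dots,X_n)$.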

  2. $Z$ is a sequence of random variables that actually depends on $n$, because so do the random variables $v$ and $Y$. First look at the case of the two-dimensional distributions. You can find the limit of the characteristic function of $(Z_{m_1},Z_{m_2})$ by splitting the expectation over the sets $\{v=k\}$, $1\leqslant k\leqslant n$, and noticing that in most cases, for $n$ large, $k+m_1$ and $k+m_2$ belong to the same block, i.e., the same set of the form $\{nj+1,\dots,n(j+1)\}$.