Independence of random variables that are functions of a sequence of stopping times


Consider a sequence of i.i.d. random variables $\{Z_n\}_{n=1}^\infty$ generating the filtration $\{\mathcal{F}_n\}_{n=1}^\infty$. Consider a stopping time $T$, and let $T_k$ be the stopping time obtained by applying $T$ to the shifted sequence $\{Z_l\}_{l=k}^\infty$, where $k \in \mathbb{N}$. That is, to decide whether, e.g., $T_3 = 2$, we need to observe $Z_3$ and $Z_4$. Also, consider the sequence of random variables

\begin{equation} U_k= \left\{ \begin{array}{ll} 1 & \text{if }\quad T_k<\infty, \\ 0 & \text{if }\quad T_k=\infty. \end{array} \right. \end{equation}

I am reading the paper "Procedures for reacting to a change in distribution" by G. Lorden. There the Strong Law of Large Numbers is applied to $\{U_k\}_{k=1}^\infty$ after claiming that the ergodic hypothesis holds for $\{Z_n\}_{n=1}^\infty$. However, I cannot see how the $\{U_k\}_{k=1}^\infty$ are independent random variables. Can someone provide an explanation? Thanks!

On BEST ANSWER

They're not necessarily independent. Let the $Z_i$ be i.i.d. fair coin flips ($P(Z_i = 1) = 1/2$) and let $T = T_1$ be $2$ if $Z_1 = Z_2 = 1$ and $\infty$ otherwise. Then $T_2$ is $3$ if $Z_2 = Z_3 = 1$ and $\infty$ otherwise.

Clearly $P(U_1 = 1) = P(U_2 = 1) = 1/4$. But $P(U_1 = U_2 = 1) = P(Z_1 = Z_2 = Z_3 = 1) = 1/8$, whereas if $U_1$ and $U_2$ were independent we would have $P(U_1 = U_2 = 1) = 1/16$.
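As a sanity check of the counterexample, a short enumeration over all eight equally likely outcomes of $(Z_1, Z_2, Z_3)$ (assuming fair coins, as above) reproduces these probabilities exactly:

```python
from itertools import product
from fractions import Fraction

# Enumerate all outcomes of three fair coin flips (Z1, Z2, Z3),
# each outcome having probability 1/8.
p1 = Fraction(0)   # P(U_1 = 1)
p2 = Fraction(0)   # P(U_2 = 1)
p12 = Fraction(0)  # P(U_1 = 1, U_2 = 1)
for z1, z2, z3 in product([0, 1], repeat=3):
    u1 = 1 if z1 == 1 and z2 == 1 else 0  # T_1 < infinity iff Z1 = Z2 = 1
    u2 = 1 if z2 == 1 and z3 == 1 else 0  # T_2 < infinity iff Z2 = Z3 = 1
    w = Fraction(1, 8)
    p1 += w * u1
    p2 += w * u2
    p12 += w * u1 * u2

print(p1, p2, p12)      # 1/4 1/4 1/8
print(p12 == p1 * p2)   # False: U_1 and U_2 are not independent
```

The exact arithmetic with `Fraction` avoids any floating-point ambiguity: the joint probability is $1/8$, not the $1/16$ that independence would require.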

However, the conclusion of the SLLN, that $\frac{1}{n} \sum_{i=1}^n U_i$ converges almost surely to $E[U_1]$, still holds as a consequence of, say, the Birkhoff ergodic theorem: each $U_k$ is $U_1$ composed with the $(k-1)$-fold shift, and the shift is measure-preserving and ergodic for an i.i.d. product measure, so the almost-sure limit is the constant $E[U_1]$.
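The ergodic-theorem conclusion can also be illustrated numerically. In the coin-flip example above, $U_k = 1$ iff $Z_k = Z_{k+1} = 1$, so a single long simulated run (a sketch, with an arbitrary seed and run length) should give an empirical average near $E[U_1] = 1/4$ even though the $U_k$ are dependent:

```python
import random

random.seed(0)
n = 200_000
# One long run of fair coin flips; U_k = 1 iff Z_k = Z_{k+1} = 1.
z = [random.randint(0, 1) for _ in range(n + 1)]
u = [z[k] * z[k + 1] for k in range(n)]
avg = sum(u) / n
print(avg)  # close to E[U_1] = 1/4
```

Adjacent $U_k$'s are positively correlated (they share a coin flip), which slows convergence slightly but does not change the limit.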