Independence of sums of iid random variables


I could not find the answer to the following question. Given iid random variables $(Z_n)_{n\in\mathbb N}$ and $X(t)=\sum_{j=1}^{t}Z_j$, is it true that $X(t_{n+1})$ is independent of $X(t_{n})$ for $t_n, t_{n+1}\in \mathbb{N}$ with $t_n < t_{n+1}$?

I would say so, since
\begin{align}
P(X(t_{n+1})=i_{n+1},\, X(t_n)=i_n)
&=P\Big(\sum_{j=1}^{t_{n+1}}Z_j=i_{n+1},\ \sum_{j=1}^{t_{n}}Z_j=i_n\Big)\\
&=P\Big(\sum_{j=t_n+1}^{t_{n+1}}Z_j+\sum_{j=1}^{t_{n}}Z_j=i_{n+1},\ \sum_{j=1}^{t_{n}}Z_j=i_n\Big)\\
&=P\Big(\sum_{j=t_n+1}^{t_{n+1}}Z_j+i_n=i_{n+1},\ \sum_{j=1}^{t_{n}}Z_j=i_n\Big)\\
&=P\Big(\sum_{j=t_n+1}^{t_{n+1}}Z_j=i_{n+1}-i_n,\ \sum_{j=1}^{t_{n}}Z_j=i_n\Big)\\
&=P\Big(\sum_{j=t_n+1}^{t_{n+1}}Z_j=i_{n+1}-i_n\Big)\,P\Big(\sum_{j=1}^{t_{n}}Z_j=i_n\Big)\\
&=P\Big(\sum_{j=1}^{t_{n+1}}Z_j=i_{n+1}\Big)\,P\Big(\sum_{j=1}^{t_{n}}Z_j=i_n\Big)
=P(X(t_{n+1})=i_{n+1})\,P(X(t_n)=i_n),
\end{align}
where I simply plugged in the known information and used the fact that the two sums in the fifth line run over disjoint index sets and are therefore independent. Now I wonder whether I may actually argue like that, since I always thought I could. I could not find the answer elsewhere, but feel free to correct me if such an answer already exists.
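For a concrete sanity check one can also simulate the two sides of the claimed factorization. A minimal sketch, assuming $Z_j\sim\mathrm{Bernoulli}(1/2)$ and the illustrative choices $t_n=2$, $t_{n+1}=5$ (both are assumptions made only for this example):

```python
import numpy as np

# Monte Carlo estimate of the joint probability vs. the product of
# marginals for one concrete pair of values (i_n, i_{n+1}) = (0, 0),
# assuming Z_j ~ Bernoulli(1/2), t_n = 2, t_{n+1} = 5.
rng = np.random.default_rng(2)
t_n, t_m, trials = 2, 5, 500_000

Z = rng.integers(0, 2, size=(trials, t_m))  # trials x t_m iid Bernoulli draws
X_tn = Z[:, :t_n].sum(axis=1)               # X(t_n)   = Z_1 + ... + Z_{t_n}
X_tm = Z.sum(axis=1)                        # X(t_{n+1}) = Z_1 + ... + Z_{t_{n+1}}

joint = np.mean((X_tn == 0) & (X_tm == 0))          # P(X(t_{n+1})=0, X(t_n)=0)
product = np.mean(X_tn == 0) * np.mean(X_tm == 0)   # P(X(t_{n+1})=0) P(X(t_n)=0)
print(f"joint   ≈ {joint:.4f}")
print(f"product ≈ {product:.4f}")
```

Comparing the two printed estimates shows whether the claimed factorization holds for this example.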

Best answer:

Disproof. WLOG we can assume that $\mathbb E[Z_i]=0$ and $\operatorname{Var}[Z_i]=1$ for all $i$. For $n<m$, the covariance of $X_m$ and $X_n$ is $$ \mathbb E\Big[\Big(\sum_{i=1}^nZ_i\Big)\Big(\sum_{i=1}^mZ_i\Big)\Big]= \mathbb E\Big[\Big(\sum_{i=1}^nZ_i\Big)^2\Big]+\mathbb E\Big[\Big(\sum_{i=1}^nZ_i\Big)\Big(\sum_{i=n+1}^mZ_i\Big)\Big]\,. $$ By the independence of $Z_1,\dots,Z_n$ from $Z_{n+1},\dots,Z_m$, the second expectation on the RHS is zero. The first expectation is the sum of the variances of $Z_1,\dots,Z_n$ (the cross terms vanish because the $Z_i$ are independent with mean zero), hence it equals $n$, which is not zero.

In other words, $X_n$ and $X_m$ are correlated and can therefore not be independent.
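The covariance computation above can be checked numerically. A minimal Monte Carlo sketch, assuming standard normal $Z_i$ and the illustrative choices $n=3$, $m=7$ (assumptions made only for this example):

```python
import numpy as np

# Monte Carlo check of the covariance computation: with iid standard
# normal Z_i (E[Z_i]=0, Var[Z_i]=1), the claim is Cov(X_n, X_m) = n
# for n < m, so X_n and X_m cannot be independent.
rng = np.random.default_rng(0)
n, m, trials = 3, 7, 200_000

Z = rng.standard_normal((trials, m))   # trials x m matrix of iid Z_i
X_n = Z[:, :n].sum(axis=1)             # X_n = Z_1 + ... + Z_n
X_m = Z.sum(axis=1)                    # X_m = Z_1 + ... + Z_m

cov_est = np.mean(X_n * X_m)           # E[X_n X_m]; both means are 0
print(f"estimated Cov(X_n, X_m) = {cov_est:.3f} (theory: {n})")
```

With this sample size the printed estimate should land close to $n=3$, consistent with the computation above.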

Hint: In contrast to $X_n$ and $X_m$, the random variables $X_n$ and $X_m-X_n$ are independent.
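The hint can be illustrated the same way. A minimal sketch, again assuming standard normal $Z_i$ and the illustrative choices $n=3$, $m=7$, estimating the covariance of $X_n$ and the increment $X_m-X_n$:

```python
import numpy as np

# Monte Carlo illustration of the hint: X_n and the increment X_m - X_n
# are built from disjoint blocks of the iid Z_i, so they are independent
# (and in particular uncorrelated).
rng = np.random.default_rng(1)
n, m, trials = 3, 7, 200_000

Z = rng.standard_normal((trials, m))
X_n = Z[:, :n].sum(axis=1)             # Z_1 + ... + Z_n
increment = Z[:, n:].sum(axis=1)       # Z_{n+1} + ... + Z_m = X_m - X_n

cov_est = np.mean(X_n * increment)     # E[X_n (X_m - X_n)]; both means are 0
print(f"estimated Cov(X_n, X_m - X_n) = {cov_est:.3f} (theory: 0)")
```

Here the printed estimate should be close to zero, in contrast to the positive covariance of $X_n$ and $X_m$.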