Let $X_0, X_1, \dots$ be iid random variables with Bernoulli distribution. Suppose that $T_n = \sum_{i=0}^{n-1} \mathbb 1_{(X_i = 1,\, X_{i+1} = 1)}$


For $p \in (0,1)$ unknown, let $X_0, X_1, \dots$ be iid random variables with Bernoulli$(p)$ distribution, i.e. $P(X_i = 1) = p$. Suppose that $T_n = \sum_{i=0}^{n-1} \mathbb 1_{(X_i = 1,\, X_{i+1} = 1)}$ is observed.

Calculate the mean and variance of $T_n$.

I can see that $T_n$ takes values in $0, 1, 2, \dots, n$, so it looks like a binomial distribution. What confuses me is when the pairs overlap. For example, $T_n = 2$ can arise from overlapping pairs such as $(X_1 = 1, X_2 = 1)$ and $(X_2 = 1, X_3 = 1)$, or from disjoint pairs such as $(X_1 = 1, X_2 = 1)$ and $(X_5 = 1, X_6 = 1)$.


For the mean, you don't have to worry about that: expectation is linear regardless of dependence between the summands. So, $$ \mathbb{E}[T_n]=\sum_{i=0}^{n-1}\mathbb{E}[1_{\{X_i=1,X_{i+1}=1\}}]=\sum_{i=0}^{n-1}P(X_i=1\text{ and }X_{i+1}=1)=np^2. $$
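A quick Monte Carlo sanity check of $\mathbb{E}[T_n] = np^2$ (a sketch; the choices of $n$, $p$, and the number of trials are arbitrary):

```python
import random

def simulate_T(n, p, rng):
    # Draw X_0, ..., X_n as Bernoulli(p) and count adjacent (1, 1) pairs.
    x = [1 if rng.random() < p else 0 for _ in range(n + 1)]
    return sum(x[i] * x[i + 1] for i in range(n))

rng = random.Random(0)
n, p, trials = 20, 0.4, 200_000
mean_hat = sum(simulate_T(n, p, rng) for _ in range(trials)) / trials
print(mean_hat, n * p**2)  # the two values should be close
```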

For the variance, you DO have to consider it. Let's compute $\mathbb{E}[T_n^2]$ to start with: $$ \begin{align*} \mathbb{E}[T_n^2]&=\mathbb{E}\left[\left(\sum_{i=0}^{n-1}1_{\{X_i=1,X_{i+1}=1\}}\right)^2\right]\\ &=\mathbb{E}\left[\sum_{i=0}^{n-1}\sum_{j=0}^{n-1}1_{\{X_i=1,X_{i+1}=1\}}1_{\{X_j=1,X_{j+1}=1\}}\right]\\ &=\sum_{i=0}^{n-1}\sum_{j=0}^{n-1}\mathbb{E}[1_{\{X_i=1,X_{i+1}=1\}}1_{\{X_j=1,X_{j+1}=1\}}]. \end{align*} $$ In order to compute this, you need to simplify each of these products of indicators. But how?

As a hint: break it down into cases, based on how $i$ and $j$ are related. For instance, if $i$ and $j$ differ by two or more, then the events $X_i=1$, $X_{i+1}=1$, $X_j=1$, and $X_{j+1}=1$ are all independent, and so the given expectation is just $p^4$.

Can you see how to move forward like this? Once you have, just remember that $\text{Var}[T_n]=\mathbb{E}[T_n^2]-\mathbb{E}[T_n]^2$.
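For reference, carrying the case analysis through ($i = j$ gives $p^2$ and occurs $n$ times; $|i-j| = 1$ gives $p^3$ and occurs $2(n-1)$ times; $|i-j| \ge 2$ gives $p^4$ and covers the remaining $(n-1)(n-2)$ pairs) yields $\text{Var}[T_n] = np^2 + 2(n-1)p^3 + (2-3n)p^4$. A simulation sketch to check this closed form (parameters chosen arbitrarily):

```python
import random
import statistics

def simulate_T(n, p, rng):
    # Draw X_0, ..., X_n as Bernoulli(p) and count adjacent (1, 1) pairs.
    x = [1 if rng.random() < p else 0 for _ in range(n + 1)]
    return sum(x[i] * x[i + 1] for i in range(n))

rng = random.Random(1)
n, p, trials = 15, 0.5, 200_000
samples = [simulate_T(n, p, rng) for _ in range(trials)]
var_hat = statistics.variance(samples)
var_exact = n * p**2 + 2 * (n - 1) * p**3 + (2 - 3 * n) * p**4
print(var_hat, var_exact)  # should agree to a couple of decimal places
```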