Let $X_1, X_2, \dots, X_n$ be independent random variables.


Let $X_1, X_2, \dots, X_n$ be independent random variables such that for $1\le i \le n$ the distribution of $X_i$ is $$\mathbb{P}(X_i=2^i)=\frac{1}{2^i}, \qquad \mathbb{P}(X_i=1)=\frac{1}{2}-\frac{1}{2^{i+1}}, \qquad \mathbb{P}(X_i=-1)=\frac{1}{2}-\frac{1}{2^{i+1}}.$$

For $1\le i \le n$, we'll define the event $A_i = \{X_i\ne2^i\}$ and the random variable $Y_i=X_i\cdot \mathbb{1}_{A_i}$.

(a) Prove that $\mathbb{E}Y_i=0$ for all $i$.

(b) For $1\le k \le n$, we'll define $B_k$ as the event that there exists $k\le i \le n$ with $X_i\ne Y_i$. Prove the following inequality: $$\mathbb{P}(B_k)\le\frac{1}{2^{k-1}}$$

My try:

(a) $Y_i$ takes the values $0, 1$ or $-1$:

$\mathbb{E}Y_i=\mathbb{P}(Y_i=1)\cdot1+\mathbb{P}(Y_i=-1)\cdot(-1)+\mathbb{P}(Y_i=0)\cdot0$

Since $\{X_i=1\}\subseteq A_i$, we have $\mathbb{P}(Y_i=1)=\mathbb{P}(\{X_i=1\}\cap A_i)=\mathbb{P}(X_i=1)$, and likewise $\mathbb{P}(Y_i=-1)=\mathbb{P}(X_i=-1)$. Both probabilities equal $\frac{1}{2}-\frac{1}{2^{i+1}}$, so $\mathbb{P}(Y_i=1)=\mathbb{P}(Y_i=-1)$.

Then the expectation above is $0$. Can anyone validate this proof?
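As a quick sanity check (not part of the proof), $\mathbb{E}Y_i$ can be computed exactly from the stated distribution; `expected_Y` below is a hypothetical helper name:

```python
from fractions import Fraction

def expected_Y(i):
    """Exact E[Y_i] for the distribution above (hypothetical helper)."""
    # Y_i = X_i * 1_{A_i}: the outcome X_i = 2^i is zeroed out, so only
    # the values +1 and -1 contribute, each with probability 1/2 - 2^-(i+1).
    p = Fraction(1, 2) - Fraction(1, 2 ** (i + 1))
    return p * 1 + p * (-1)

print(all(expected_Y(i) == 0 for i in range(1, 20)))  # → True
```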

(b) I'm struggling a bit with the definitions. How is it possible that $X_i=Y_i$? $Y_i$ takes the values $0$, $1$, or $-1$, while $X_i$ takes $2^i$, $1$, or $-1$, so it seems to me they would never be equal.
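For concreteness, here is a small sketch of how an outcome of $X_i$ maps to $Y_i$ under the definition above (`Y` is a hypothetical helper, and $i=3$ is an arbitrary choice):

```python
def Y(x, i):
    # Y_i = X_i * 1_{A_i}: the indicator is 1 exactly when X_i != 2^i,
    # so Y_i equals X_i on the event A_i and is 0 otherwise.
    return x if x != 2 ** i else 0

i = 3
for x in (2 ** i, 1, -1):
    print(x, Y(x, i), x == Y(x, i))
# → 8 0 False
# → 1 1 True
# → -1 -1 True
```

So $X_i=Y_i$ holds precisely on the event $A_i$, i.e. whenever $X_i\in\{1,-1\}$.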

I hope you could help me get the definitions straight.

Observe that $$B_k=\bigcup_{i=k}^nA_i^c,$$ because the only way that $X_i$ and $Y_i$ can differ is for the indicator of $A_i$ to vanish, i.e. for $X_i=2^i$. Since $\Pr(A_i^c)=2^{-i}$, the union bound gives $$ \Pr\left(B_k\right)\leqslant \sum_{i=k}^n2^{-i}=2^{-k}\underbrace{\sum_{j=0}^{n-k}2^{-j}}_{\leqslant\, 2}\leqslant\frac{1}{2^{k-1}}. $$
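The bound can be checked numerically with a quick Monte Carlo sketch; `simulate_B_k`, the choice $n=10$, and the trial count are illustrative assumptions, not part of the problem:

```python
import random

def simulate_B_k(n, k, trials=100_000, seed=0):
    # Estimate P(B_k). B_k occurs iff X_i = 2^i for some k <= i <= n;
    # each such event has probability 2^-i, independently across i.
    rng = random.Random(seed)
    hits = sum(
        any(rng.random() < 2.0 ** (-i) for i in range(k, n + 1))
        for _ in range(trials)
    )
    return hits / trials

for k in (1, 2, 3, 4):
    print(k, simulate_B_k(n=10, k=k), 2.0 ** (1 - k))  # estimate vs. bound 2^(1-k)
```

The estimates stay below $2^{1-k}$, consistent with the inequality; the union bound is not tight here because the events $A_i^c$ can overlap.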