Computing $E(X+U|X)$ where $X$ is Bernoulli and $U$ normal


A network source sends a sequence of zeros and ones, $X_1, X_2, \dots$, with the $X_i$ i.i.d. Bernoulli, $p = P(X_i = 1)$, $0 < p < 1$.

Due to disturbances, the received sequence is $Y_1, Y_2, \dots$ with $Y_i = X_i + U_i$. The $U_i$ are i.i.d. $N(0, \sigma^2)$ with $\sigma^2 = \frac{1}{3}$.

If $Y_i > \frac{1}{2}$, the received value is interpreted as $X_i = 1$, and as $X_i = 0$ otherwise. I.e., the $i$th signal is decoded as $1$ if the received signal exceeds $\frac{1}{2}$, and as $0$ otherwise.
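As a sketch, the channel and the threshold decoder described above can be simulated with the standard library alone; the function names and the choice $p = 0.3$ are illustrative, not part of the problem:

```python
import random

random.seed(1)
p, sigma2 = 0.3, 1 / 3            # illustrative p; sigma^2 = 1/3 from the problem

def send():
    """Draw one source bit X_i ~ Bernoulli(p)."""
    return 1 if random.random() < p else 0

def channel(x):
    """Received value Y_i = X_i + U_i with U_i ~ N(0, sigma^2)."""
    return x + random.gauss(0, sigma2 ** 0.5)

def decode(y):
    """Threshold rule: interpret as 1 iff Y_i > 1/2."""
    return 1 if y > 0.5 else 0

bits = [send() for _ in range(100_000)]
received = [channel(x) for x in bits]
decoded = [decode(y) for y in received]
error_rate = sum(d != x for d, x in zip(decoded, bits)) / len(bits)
```

By symmetry of the threshold about both $0$ and $1$, the per-bit error probability is $P(U_i > \frac{1}{2}) = 1 - \Phi(\sqrt{3}/2) \approx 0.193$ regardless of $p$, which the simulated `error_rate` should approximate.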

I'm looking to determine the numbers $E(Y_i|X_i = 1)$ and $E(Y_i|X_i = 0)$, and the random variable $E(Y_i|X_i).$

(Also, I calculated $P(Y > \frac{1}{2}) = 0.6728p + 0.1932$ from an earlier part of the question).

Attempt

I'm not really sure how to go about this, as we're dealing with random variables $Y_i$ that are functions of both discrete random variables $X_i$ and continuous random variables $U_i$...

Conditional on $X_i = 1$ we have $Y_i = 1 + U_i$, so $Y_i \mid X_i = 1 \sim N(1, \sigma^2)$ with density $f_{Y_i \mid X_i = 1}(y_i) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(y_i - 1)^2 / 2\sigma^2}$.

$E(Y_i \mid X_i = 1) = \int_{-\infty}^{\infty} y_i\, f_{Y_i \mid X_i = 1}(y_i)\, dy_i$

$= \int_{-\infty}^{\infty} (u_i + 1)\, f_{U_i}(u_i)\, du_i$ (substituting $y_i = u_i + 1$)

I'm not sure where to go from here, to be honest. Does anyone know how to approach this?

Best answer

If $U_i$ is independent of $X_i$ then $E(Y_i\mid X_i)=E(X_i\mid X_i)+E(U_i\mid X_i)=X_i+E(U_i)$ hence $E(Y_i\mid X_i)=X_i$. In particular, $E(Y_i\mid X_i=1)=1$ and $E(Y_i\mid X_i=0)=0$.
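A quick Monte Carlo sanity check of these conditional means (the value $p = 0.3$ is an arbitrary illustrative choice; any $0 < p < 1$ works):

```python
import random

random.seed(0)
p, sigma = 0.3, (1 / 3) ** 0.5    # illustrative p; sigma^2 = 1/3 as in the problem
n = 200_000

ys_given_1, ys_given_0 = [], []
for _ in range(n):
    x = 1 if random.random() < p else 0
    y = x + random.gauss(0, sigma)           # Y_i = X_i + U_i
    (ys_given_1 if x else ys_given_0).append(y)

m1 = sum(ys_given_1) / len(ys_given_1)       # estimates E(Y_i | X_i = 1)
m0 = sum(ys_given_0) / len(ys_given_0)       # estimates E(Y_i | X_i = 0)
```

The sample means `m1` and `m0` should be close to $1$ and $0$ respectively, matching $E(Y_i \mid X_i) = X_i$.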

> (Also, I calculated $P(Y > \frac{1}{2}) = 0.6728p + 0.1932$ from an earlier part of the question).

Actually, if $\sigma^2=\frac13$ then $P\left(Y > \frac{1}{2}\right) =\left(2\Phi\left(\frac{\sqrt3}2\right)-1\right)\cdot p+1-\Phi\left(\frac{\sqrt3}2\right)$. Numerically, this is $P\left(Y > \frac{1}{2}\right)\approx0.6135\cdot p+0.1932$.
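The corrected coefficients can be verified numerically with the standard library; `Phi` below is the standard normal CDF expressed via the error function:

```python
from math import erf, sqrt

def Phi(x):
    """Standard normal CDF: Phi(x) = (1 + erf(x / sqrt(2))) / 2."""
    return 0.5 * (1 + erf(x / sqrt(2)))

a = Phi(sqrt(3) / 2)     # Phi(sqrt(3)/2) ≈ 0.8068
slope = 2 * a - 1        # coefficient of p:  2*Phi(sqrt(3)/2) - 1 ≈ 0.6135
intercept = 1 - a        # constant term:  1 - Phi(sqrt(3)/2) ≈ 0.1932
```

This confirms the slope $\approx 0.6135$ (not $0.6728$) and the intercept $\approx 0.1932$.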