If $X,Y,Z$ are random variables, with $Z = X$ with probability half and $Z=Y$ with probability half, is $E(Z) = .5E(X) + .5E(Y)$?


Question: If $X,Y,Z$ are random variables, with $Z = X$ with probability half and $Z=Y$ with probability half, is $E(Z) = .5E(X) + .5E(Y)$? (as in the title). Additionally, is this simply trivial (true by definition), or does it follow from something?

I ask because I am used to the following: when taking the expectation of something discrete, say $g(T)$, with $g$ a function and $T$ a discrete R.V., we have $E(g(T)) = \sum_x g(x) P(T=x)$, not $\sum_x E(g(x)) P(T=x)$ (which I suppose are equivalent, since each $g(x)$ is a scalar).

Basically, what if a random variable's realizations are themselves realizations of other random variables? How is expectation handled then?

To share my motivation: If we have an estimator $W(X)$ for some parameter $\mu$, and loss is mean square, then risk is $$R(\mu,W(X))= E((W(X) - \mu)^2)$$

If $W(X)$ is a single deterministic function of $X$, then I can evaluate $E((W(X) - \mu)^2)$. I am confused, however, if, say, $W(X) = X$ with probability $\frac{1}{3}$ and $W(X) = X^2$ with probability $\frac{2}{3}$; in that case I think $$R(\mu, W(X)) = R(\mu, W(X)=X)\frac{1}{3} + R(\mu, W(X) =X^2) \frac{2}{3}$$

but I don't exactly see why this is.

Thanks.

Accepted answer:

No, not in general. Let $X \sim \mathrm{Be}(0.5)$, $Y \sim N(0,1)$ with $X$ and $Y$ independent, $A = \{X=0\}$ (so $P(A)=0.5$), and $Z = X1_A + Y1_{A^C}$.

$$E[Z] = E[X1_A] + E[Y1_{A^C}] = E[0 \times 1_A] + E[Y]E[1_{A^C}] = 0 + (0)(0.5) = 0$$

However,

$$0.5E[X] + 0.5E[Y] = 0.5(0.5) + 0.5(0) = 0.25$$

Observe that $X$ and $1_A$ are not independent (in fact, $X=1_{A^C}$), which is why the mixture formula fails here.
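A quick Monte Carlo check of this counterexample (a sketch of my own using NumPy; variable names are not from the answer):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = rng.integers(0, 2, size=n)   # X ~ Bernoulli(0.5)
y = rng.standard_normal(n)       # Y ~ N(0, 1), independent of X
z = np.where(x == 0, x, y)       # Z = X on A = {X = 0}, Z = Y on A^C

print(z.mean())                          # ~ 0
print(0.5 * x.mean() + 0.5 * y.mean())   # ~ 0.25
```

The two printed values disagree, confirming that $E[Z] \ne 0.5E[X] + 0.5E[Y]$ when the choice event depends on $X$.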


In your case, I suspect independence of the choice event from $X$ is implicitly assumed:

$$E[W^2] = E[(X1_A + X^21_{A^C})^2] = E[X^21_A + X^41_{A^C}]$$

(the cross term vanishes since $1_A 1_{A^C} = 0$), and then

$$ \stackrel{?}{=} E[X^2]E[1_A] + E[X^4]E[1_{A^C}],$$

where the last step holds only if $1_A$ is independent of $X$.
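A simulation of this decomposition under independence (a hypothetical setup of my own: $X \sim N(0,1)$, so $E[X^2]=1$ and $E[X^4]=3$, and the selection coin is drawn independently of $X$):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

x = rng.standard_normal(n)         # X ~ N(0, 1)
coin = rng.random(n) < 1 / 3       # selection event with P = 1/3, independent of X
w = np.where(coin, x, x**2)        # W = X w.p. 1/3, W = X^2 w.p. 2/3

lhs = (w**2).mean()                               # estimates E[W^2]
rhs = (x**2).mean() / 3 + (x**4).mean() * 2 / 3   # E[X^2]P(A) + E[X^4]P(A^C)

print(lhs, rhs)  # both ~ 1/3 + 2 = 7/3
```

Because the coin is independent of $X$, the two estimates agree; replacing `coin` with an event defined from `x` breaks the equality, as in the counterexample above.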

Second answer:

$Z$ here is an example of what is known as a mixture distribution, sampled by the following process:

  1. A distribution is used to choose one from a set of other distributions.
  2. The chosen distribution is sampled.

When the choice in step 1 is made independently of the component distributions, the expected value of $Z$ is the mean of the expected values of $X$ and $Y$. More generally, the mixture distribution $M$ with sub-distributions $M_1,M_2,\dots,M_n$ and corresponding probabilities of being selected $w_1,w_2,\dots,w_n$ has expected value $$E[M]=\sum_{i=1}^nw_iE[M_i].$$
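The two-step sampling process above can be sketched directly (a minimal example of my own, assuming three normal components with unit variance and hypothetical means/weights):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

means = np.array([0.0, 5.0, -2.0])   # component means mu_i (hypothetical)
w = np.array([0.2, 0.5, 0.3])        # mixing weights w_i, summing to 1

idx = rng.choice(3, size=n, p=w)       # step 1: pick a component, independently
samples = rng.normal(means[idx], 1.0)  # step 2: sample the chosen N(mu_i, 1)

print(samples.mean())  # ~ sum(w_i * mu_i) = 0.2*0 + 0.5*5 + 0.3*(-2) = 1.9
```

The empirical mean matches $\sum_i w_i E[M_i]$, which is exactly the mixture formula above.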