Question: If $X,Y,Z$ are random variables with $Z = X$ with probability one half and $Z=Y$ with probability one half, is $E(Z) = 0.5E(X) + 0.5E(Y)$ (as in the title)? Additionally, is this simply trivial (true by definition), or does it follow from something?
I ask because I am used to the following: when taking the expectation of something discrete, say $g(T)$, with $g$ a function and $T$ a discrete r.v., we have $E(g(T)) = \sum_x g(x) P(T=x)$, not $\sum_x E(g(x)) P(T=x)$ (though I guess these are equivalent, since each $g(x)$ is just a scalar).
Basically, what if a random variable takes on the realizations of other random variables? How is expectation handled then?
To share my motivation: if we have an estimator $W(X)$ for some parameter $\mu$, and the loss is squared error, then the risk is $$R(\mu,W(X))= E((W(X) - \mu)^2)$$
If $W$ is a fixed function of $X$, then I can evaluate $E((W(X) - \mu)^2)$. I am confused, however, if, say, $W(X) = X$ with probability $\frac{1}{3}$ and $W(X) = X^2$ with probability $\frac{2}{3}$; then I think $$R(\mu, W(X)) = R(\mu, W(X)=X)\frac{1}{3} + R(\mu, W(X) =X^2) \frac{2}{3}$$
but I don't exactly see why this is.
Thanks.
No. Let $X \sim \mathrm{Bernoulli}(0.5)$ and $Y \sim N(0,1)$ be independent, let $A = \{X=0\}$, and let $Z = X1_A + Y1_{A^C}$. Then $Z = X$ with probability one half and $Z = Y$ with probability one half, but the event choosing between them depends on $X$.
$$E[Z] = E[X1_A] + E[Y1_{A^C}] = E[0 \cdot 1_A] + E[Y]E[1_{A^C}] = 0 + 0(0.5) = 0,$$ using that $X = 0$ on $A$ and that $Y$ is independent of $A$.
However,
$$0.5E[X] + 0.5E[Y] = 0.5(0.5) + 0.5(0) = 0.25 \neq 0 = E[Z]$$
Observe that $X$ and $1_A$ are not independent (in fact, $X=1_{A^C}$); this dependence is exactly what breaks the formula.
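A quick NumPy simulation of this counterexample (a sketch; the sample size, seed, and variable names are my own choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = rng.integers(0, 2, size=n)   # X ~ Bernoulli(0.5)
y = rng.standard_normal(n)       # Y ~ N(0, 1), independent of X
a = (x == 0)                     # A = {X = 0}: NOT independent of X

z = np.where(a, x, y)            # Z = X on A, Z = Y on A^C

print(z.mean())                          # ~ 0, the true E[Z]
print(0.5 * x.mean() + 0.5 * y.mean())   # ~ 0.25, the naive mixture formula
```

The two printed values disagree, confirming that the mixture formula fails when the selection event depends on $X$.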
In your case, I take it the randomization is assumed independent of $X$. Let $A$ be the event $\{W = X\}$, with $P(A) = \frac{1}{3}$ and $A$ independent of $X$. Then, for instance,
$$E[W^2] = E[(X1_A + X^21_{A^C})^2] = E[X^21_A + X^41_{A^C}]$$
$$= E[X^2]E[1_A] + E[X^4]E[1_{A^C}] = \frac{1}{3}E[X^2] + \frac{2}{3}E[X^4],$$
where the cross terms vanish because $1_A 1_{A^C} = 0$, and the factorization in the second line is valid precisely because $A$ is independent of $X$. The same computation applied to $(W - \mu)^2$ gives
$$R(\mu, W) = \frac{1}{3}E[(X-\mu)^2] + \frac{2}{3}E[(X^2-\mu)^2],$$
which is your mixture-of-risks formula; it is just the law of total expectation, conditioning on the outcome of the (independent) coin flip.
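A simulation illustrates the mixture-of-risks formula when the coin is independent of $X$ (a sketch; the choice $X \sim N(\mu, 1)$ with $\mu = 1$, and all variable names, are my own assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
mu = 1.0

x = rng.normal(mu, 1.0, size=n)   # X ~ N(mu, 1), an arbitrary illustrative choice
coin = rng.random(n) < 1 / 3      # independent coin: W = X with probability 1/3

w = np.where(coin, x, x**2)       # W = X w.p. 1/3, W = X^2 w.p. 2/3

# Risk computed directly vs. via the mixture of the two risks
risk_mixed = np.mean((w - mu) ** 2)
risk_parts = (1 / 3) * np.mean((x - mu) ** 2) + (2 / 3) * np.mean((x**2 - mu) ** 2)

print(risk_mixed, risk_parts)     # the two agree up to simulation noise
```

Because the coin is drawn independently of $X$, both quantities estimate the same risk, matching the law-of-total-expectation argument above.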