Suppose I have a random variable $Z$ that, with probability $p$, is distributed as a random variable $X \sim N(\mu,\sigma)$ and, with probability $1-p$, as a random variable $Y \sim \Gamma(\alpha,\beta)$.
Intuitively I think that $E(Z) = pE(X) + (1-p)E(Y)$,
but I don't know how to show it formally. Something tells me it has to do with conditional expectation. Could anyone give me a hint on this?
Thank you.
This is the law of total expectation, which says that $$ \operatorname{E}(X) = \sum_i \operatorname{E}(X \mid A_i)\, \operatorname{P}(A_i), $$ where $\{A_i\}_i$ is a finite or countable partition of the sample space.
Suppose that $W$ is a random variable, independent of $X$ and $Y$, such that $P(W=1)=p$ and $P(W=0)=1-p$. Then $Z=WX+(1-W)Y$ has the distribution that you describe, i.e. $Z$ is equal to $X$ with probability $p$ and to $Y$ with probability $1-p$. By independence, $\operatorname E(Z\mid W=1)=\operatorname EX$ and $\operatorname E(Z\mid W=0)=\operatorname EY$, so by the law of total expectation, \begin{align*} \operatorname EZ&=\operatorname E(Z\mid W=1)P(W=1)+\operatorname E(Z\mid W=0)P(W=0)\\&=p\operatorname EX+(1-p)\operatorname EY. \end{align*}
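As a sanity check, you can verify the identity by Monte Carlo simulation. Here is a quick sketch in Python/NumPy; the parameter values are arbitrary, and I'm assuming the $\Gamma(\alpha,\beta)$ is parametrized by shape $\alpha$ and scale $\beta$, so $\operatorname EY=\alpha\beta$:

```python
import numpy as np

# Mixture parameters (arbitrary illustrative values)
p, mu, sigma = 0.3, 2.0, 1.0      # weight, Normal mean and std
alpha, beta = 2.0, 1.5            # Gamma shape and scale

rng = np.random.default_rng(0)
n = 1_000_000

# W ~ Bernoulli(p) selects the component, independently of X and Y
W = rng.random(n) < p
X = rng.normal(mu, sigma, n)
Y = rng.gamma(alpha, beta, n)     # shape alpha, scale beta

# Z = W*X + (1-W)*Y: equals X with probability p, Y with probability 1-p
Z = np.where(W, X, Y)

sample_mean = Z.mean()
theoretical = p * mu + (1 - p) * alpha * beta   # p E(X) + (1-p) E(Y)
print(sample_mean, theoretical)
```

The two printed values should agree to within Monte Carlo error.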