I have two independent random variables $X$ and $Y$. I flip a coin with distribution $U \sim \text{Bernoulli}(p)$. Consider the following random variable: $$ M=\begin{cases} X, & \text{if $U = 1$}\\ Y, & \text{if $U = 0$} \end{cases} $$
I would like to calculate the characteristic function of $M$: $$\varphi_M(s) = E\left[ e^{i s M} \right]$$ Note that $M$ is $X$ with probability $p$ and $Y$ with probability $q = 1 - p$. So, my first attempt is to write: $$\varphi_M(s) = E\left[ e^{i s M} \right]= e^{isX}p + e^{isY}q$$ But I think that this is wrong, since the right-hand side is still random!
Can you help me?
**Update**
My intuition says that
$$\varphi_M(s) = p \varphi_X(s) + q \varphi_Y(s)$$
where $\varphi_X(s)$ and $\varphi_Y(s)$ are, respectively, the characteristic functions of $X$ and $Y$, but I don't know how to show whether this is true or false.
Supposing $U$ is independent of $X$ and $Y$, we have: \begin{eqnarray*} \varphi_M(s)&=&\mathbb E[e^{isM}(\mathbb 1_{U=1}+\mathbb 1_{U=0})]\\ &=&\mathbb E[e^{isM}\mathbb 1_{U=1}]+\mathbb E[e^{isM}\mathbb 1_{U=0}]\\ &=&\mathbb E[e^{isX}\mathbb 1_{U=1}]+\mathbb E[e^{isY}\mathbb 1_{U=0}]\\ &=&\mathbb E[e^{isX}]\mathbb E[\mathbb 1_{U=1}]+\mathbb E[e^{isY}]\mathbb E[\mathbb 1_{U=0}]\\ &=&\varphi_X(s)\mathbb P(U=1)+\varphi_Y(s)\mathbb P(U=0)\\ &=&p\varphi_X(s)+q\varphi_Y(s) \end{eqnarray*} The third line uses that $M = X$ on $\{U=1\}$ and $M = Y$ on $\{U=0\}$; the fourth uses the assumed independence of $U$ from $(X, Y)$.
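As a numerical sanity check of the mixture formula $\varphi_M(s) = p\varphi_X(s) + q\varphi_Y(s)$, here is a Monte Carlo sketch. The specific distributions ($X \sim N(0,1)$, $Y \sim \text{Exp}(1)$) and the value $p = 0.3$ are my own illustrative choices, not part of the question:

```python
import numpy as np

# Monte Carlo check of phi_M(s) = p*phi_X(s) + q*phi_Y(s).
# Illustrative assumptions: X ~ Normal(0,1), Y ~ Exponential(rate 1), p = 0.3.
rng = np.random.default_rng(0)
n = 200_000
p = 0.3
s = 1.7

X = rng.normal(0.0, 1.0, n)
Y = rng.exponential(1.0, n)
U = rng.random(n) < p          # Bernoulli(p) coin, independent of X and Y
M = np.where(U, X, Y)          # M = X if U = 1, else Y

# Empirical characteristic function of M
phi_M_emp = np.mean(np.exp(1j * s * M))

# Closed forms: phi_X(s) = exp(-s^2/2), phi_Y(s) = 1/(1 - i s)
phi_mix = p * np.exp(-s**2 / 2) + (1 - p) / (1 - 1j * s)

print(abs(phi_M_emp - phi_mix))  # small, and shrinks like 1/sqrt(n)
```

The empirical average agrees with the mixture of the two closed-form characteristic functions up to Monte Carlo error.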