Let $X_1$ be a random variable on the alphabet $\mathcal X_{1} = \{1,\dots,m\}$ and $X_2$ a random variable on $\mathcal X_{2} = \{m+1,\dots,n\}$.
Let $X$ be a random variable distributed as:
$$X = \begin{cases} X_1 & \text{with probability } \alpha \\ X_2 & \text{with probability } 1-\alpha \end{cases}$$
I need to find $H(X)$ in terms of $H(X_1)$, $H(X_2)$, and $\alpha$.
So I started with $H(X) = - \sum_{x \in \mathcal X}{p_{X}(x)\log_2 p_X(x)}$,
but $X$ is distributed like $X_1$ with probability $\alpha$ and like $X_2$ with probability $1-\alpha$, so $p_X(x)$ looks like a piecewise function. How can I write it in terms of $p_{X_1}$ and $p_{X_2}$?
It seems we can't simply write $p_{X} = \alpha p_{X_1} + (1 - \alpha)p_{X_2}$, because $X$ is fully distributed like $X_1$ with probability $\alpha$ or fully distributed like $X_2$ with probability $1-\alpha$.
We can assume the existence of a Bernoulli random variable $B$ with success parameter $\alpha$, independent of $X_1$ and $X_2$, such that $X = B X_1 + (1-B) X_2$. Let $p(\cdot)$ be the probability mass function of $X$. Since the alphabets $\mathcal X_1$ and $\mathcal X_2$ are disjoint, $p(x) = \alpha\, p_{X_1}(x)$ for $x \in \mathcal X_1$ and $p(x) = (1-\alpha)\, p_{X_2}(x)$ for $x \in \mathcal X_2$. So the mixture formula $p = \alpha p_{X_1} + (1-\alpha) p_{X_2}$ does hold, once each pmf is extended by zero outside its own alphabet. We have \begin{align} H(X) &= \mathbb{E}[-\log p(X)]\\ &= \alpha\, \mathbb{E}[-\log p(X) \mid B=1] + (1-\alpha)\, \mathbb{E}[-\log p(X) \mid B=0]\\ &= \alpha\, \mathbb{E}[-\log (\alpha\, p_{X_1}(X_1))] + (1-\alpha)\, \mathbb{E}[-\log ((1-\alpha)\, p_{X_2}(X_2))]\\ &= \alpha\big({-\log\alpha} + H(X_1)\big) + (1-\alpha)\big({-\log(1-\alpha)} + H(X_2)\big)\\ &= H_b(\alpha) + \alpha\, H(X_1) + (1-\alpha)\, H(X_2), \end{align} where $H_b(\alpha) = -\alpha\log\alpha - (1-\alpha)\log(1-\alpha)$ is the binary entropy function. Note that conditioning on $B=1$ does not turn $p$ into $p_{X_1}$: even given $B=1$, the pmf of $X$ is still $p$, and on $\mathcal X_1$ it equals $\alpha\, p_{X_1}$. That extra factor of $\alpha$ inside the logarithm is exactly where the $H_b(\alpha)$ term comes from.
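As a quick numerical sanity check of the closed form $H(X) = H_b(\alpha) + \alpha H(X_1) + (1-\alpha) H(X_2)$, the sketch below builds the mixture pmf on the union of two disjoint alphabets and compares the entropy computed directly against the formula. The specific pmfs and $\alpha$ are made up for illustration:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector (0*log(0) treated as 0)."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Illustrative example: X1 on {1,2,3}, X2 on {4,5} (disjoint alphabets)
p1 = [0.5, 0.3, 0.2]   # pmf of X1
p2 = [0.6, 0.4]        # pmf of X2
alpha = 0.3

# pmf of X on the union alphabet: alpha*p1 on X1's part, (1-alpha)*p2 on X2's part
pX = [alpha * q for q in p1] + [(1 - alpha) * q for q in p2]

H_direct = entropy(pX)
H_formula = entropy([alpha, 1 - alpha]) + alpha * entropy(p1) + (1 - alpha) * entropy(p2)

print(abs(H_direct - H_formula) < 1e-12)  # the two computations agree
```

Note that $\alpha H(X_1) + (1-\alpha) H(X_2)$ alone (without the $H_b(\alpha)$ term) would undercount, since it ignores the uncertainty about which component was drawn.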