Let $P=\{p_1, p_2, p_3, \ldots, p_n\}$ and $P'= \left\{ \dfrac{p_1 + p_2}{2}, \dfrac{p_1 + p_2}{2}, p_3, \ldots, p_n\right\}$ be distributions on the same random variable $X$.
$1$. Show $H(X)\leq H(X')$, where $H(X')$ denotes the entropy of $X$ under $P'$ and $H$ is Shannon's entropy: $$H(X) = -\sum_{i=1}^n p_i\log_{2}p_i$$
This makes sense, since $P'$ is closer to the uniform distribution, which maximizes entropy, but I'm not sure how to go about proving it. I believe the intended approach uses the fact that entropy can be decomposed into its binary components (with normalization along the way).
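Before proving it, a quick numerical sanity check may help build intuition. Below is a minimal sketch (the distribution `P` is an arbitrary example, not from the question) that averages the first two masses and confirms the entropy does not decrease:

```python
import math

def entropy(p):
    """Shannon entropy in bits; terms with p_i = 0 contribute 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical example distribution.
P = [0.5, 0.1, 0.3, 0.1]
avg = (P[0] + P[1]) / 2
P_prime = [avg, avg] + P[2:]  # first two masses replaced by their average

print(entropy(P))        # H(X)
print(entropy(P_prime))  # H(X'); should be >= H(X)
```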
$2$. Define $P''=\left\{ p_1, \ldots, p_{i-1}, \dfrac{p_i + p_j}{2}, p_{i+1}, \ldots, p_{j-1}, \dfrac{p_i + p_j}{2}, p_{j+1}, \ldots, p_n \right\}$. Use the "permutation principle" and part $(1)$ to show $H(X)\leq H(X'')$.
There are several ways to attack this, some of them pointed out in stochasticboy321's comment.
Another way, which looks elegant to me, requires knowing this (useful, and not hard to prove) property of entropy: $H(p)$ is a concave function of the distribution $p$.
Granted this, consider the two distributions $p_A=(p_1, p_2, p_3, \ldots, p_n)$ and $p_B=(p_2, p_1, p_3, \ldots, p_n)$, and let $p_C = (p_A + p_B)/2$. Since $p_B$ is just a permutation of $p_A$, clearly $H(p_A)=H(p_B)$; moreover, $p_C$ is exactly $P'$.
Hence, by concavity $$H(X')=H(p_C) \ge \frac{H(p_A)+H(p_B)}{2}= H(p_A)=H(X)$$
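The same argument handles part $(2)$: swap coordinates $i$ and $j$ instead of $1$ and $2$. A small sketch checking this numerically on a random distribution (the pmf and the index pair are arbitrary choices for illustration):

```python
import math
import random

def entropy(p):
    """Shannon entropy in bits; terms with p_i = 0 contribute 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

random.seed(0)
# Arbitrary random pmf (any valid distribution works).
w = [random.random() for _ in range(6)]
p = [x / sum(w) for x in w]

i, j = 1, 4  # arbitrary pair of indices
q = list(p)
q[i] = q[j] = (p[i] + p[j]) / 2  # this is P'' from the question

# Swapping coordinates i and j leaves entropy unchanged (permutation
# principle), and q is the midpoint of p and its swapped copy, so
# concavity gives H(q) >= (H(p) + H(p_swapped))/2 = H(p).
p_swapped = list(p)
p_swapped[i], p_swapped[j] = p[j], p[i]

print(entropy(p))  # H(X)
print(entropy(q))  # H(X''); should be >= H(X)
```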