$$S = \{ X, Y, Z, W\};\\P(X) = 0.1;\\P(Y) = 0.5;\\P(Z) = p;\\P(W) = q$$
I don't know how to find the entropy of the source $S$ when the two probabilities $p$ and $q$ are unknown.
For which values of $p$ and $q$ does the entropy reach its maximum, and why?
The formula for exercise 1 is $\sum_{i=1}^n p_i \log_2 \frac{1}{p_i}$. With my known data it looks like this: $$0.1\log_2\frac{1}{0.1} + 0.5\log_2\frac{1}{0.5} + p\log_2\frac{1}{p} + q\log_2\frac{1}{q} = 0.832 + p\log_2\frac{1}{p} + q\log_2\frac{1}{q}$$
And for the second exercise:
$$0.1\log_2\frac{1}{0.1} + 0.5\log_2\frac{1}{0.5} + 2(0.2\log_2\frac{1}{0.2}) = 1.761$$
Does it seem right?
This is more a comment than a solution, but I still lack the reputation to post it as one. The formula you posted is the one for Shannon entropy. I assume you know that $$P(X)+P(Y)+P(Z)+P(W)=1,$$ where each summand is non-negative. A coin with $S = \{\text{Head}, \text{Tail}\}$ and $P(\text{Head}) = 0.1$ has a lower entropy per throw (about $0.47$ bits) than one where both sides are equally likely, which gives $1.0$ bit of entropy. In general, entropy is highest when all events are as equally likely as possible (think about the coin example).
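For reference, the two coin values above follow from the binary entropy formula:
$$H(0.1) = 0.1\log_2\frac{1}{0.1} + 0.9\log_2\frac{1}{0.9} \approx 0.332 + 0.137 \approx 0.47 \text{ bits}, \qquad H(0.5) = 2\cdot 0.5\log_2\frac{1}{0.5} = 1.0 \text{ bit}.$$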
This is why you should choose values such that $p = q$, which gives $p = 0.2$, since $p = \dfrac{1-P(X)-P(Y)}{2} = \dfrac{1-0.1-0.5}{2} = 0.2$.
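A short way to see why $p = q$ maximizes the two unknown terms (a sketch, using the constraint $p + q = 1 - 0.1 - 0.5 = 0.4$): write $q = 0.4 - p$ and set the derivative of the variable part to zero,
$$f(p) = p\log_2\frac{1}{p} + (0.4-p)\log_2\frac{1}{0.4-p}, \qquad f'(p) = \log_2\frac{0.4-p}{p} = 0 \;\Rightarrow\; p = 0.4 - p \;\Rightarrow\; p = q = 0.2.$$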
I recommend trying to calculate the entropy of the source with different $(p, q)$ values, since I am not 100% sure.
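A minimal sketch of that check (assuming Python; the helper name `entropy` is just illustrative):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Fixed probabilities P(X) = 0.1, P(Y) = 0.5; vary p, with q = 0.4 - p.
for p in [0.05, 0.1, 0.2, 0.3, 0.35]:
    q = 0.4 - p
    print(f"p={p:.2f}, q={q:.2f}, H={entropy([0.1, 0.5, p, q]):.4f}")
# The printed entropy peaks at p = q = 0.2, giving H ≈ 1.761 bits.
```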