Let $X$ be a RV with values in the set $S = \{1, \ldots, n\}$ and probability distribution $p_k$ for $k = 1, \ldots , n$. Let $T \subseteq S$ and suppose that $P(X \in T) = \alpha$. Let $Y$ be a RV with values in $\{0, 1\}$ and with $Y = 1$ if $X \in T$ and $Y = 0$ otherwise.
Compute $H(X\mid Y)$ and $H(Y \mid X)$.
I know that the formula for conditional entropy is as follows:
$$H(X \mid Y) = - \sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p_{xy} \log \biggl(\frac{p_{xy}}{p_y}\biggr)$$
But I do not see how to compute them in this case, as there are no "numbers" here. What is clear to me, however, is that knowledge of $X$ gives total knowledge of $Y$, but I do not quite see how this helps. Could you help me?
You can calculate the conditional entropy by noting that $Y=1\iff X\in T$, so the joint probability $p_{xy}$ is nonzero only when $y$ matches the indicator of $x\in T$, and on those terms $p_{xy}=p_x$. Thus
$$\begin{align}H(X\mid Y)&=-\sum_{y=1,\;i\in T}p_{iy}\log\frac{p_{iy}}{p_y}-\sum_{y=0,\;i\notin T}p_{iy}\log\frac{p_{iy}}{p_y}\\&=-\sum_{i\in T}p_i\log\frac{p_i}{\alpha}-\sum_{i\notin T}p_i\log\frac{p_i}{1-\alpha}\\&=-\sum_{i\in S}p_i\log p_i+\alpha \log\alpha+(1-\alpha)\log(1-\alpha)\\&=H(X)-H(Y)\end{align}$$
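As a quick sanity check, here is a small numerical experiment. The distribution $p=(0.1,0.2,0.3,0.4)$ and the set $T=\{1,2\}$ are arbitrary choices of mine, not from the problem statement; any distribution should satisfy the identity $H(X\mid Y)=H(X)-H(Y)$ derived above.

```python
import math

# Made-up example: n = 4 outcomes, T = {1, 2} (arbitrary choices).
p = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}
T = {1, 2}
alpha = sum(p[k] for k in T)  # P(X in T)

def H(dist):
    """Shannon entropy (in bits) of a collection of probabilities."""
    return -sum(q * math.log2(q) for q in dist if q > 0)

# H(X|Y) computed directly from the two conditional sums in the derivation:
H_X_given_Y = (-sum(p[k] * math.log2(p[k] / alpha) for k in T)
               - sum(p[k] * math.log2(p[k] / (1 - alpha)) for k in p if k not in T))

H_X = H(p.values())           # entropy of X
H_Y = H([alpha, 1 - alpha])   # entropy of the binary variable Y

print(H_X_given_Y, H_X - H_Y)
assert abs(H_X_given_Y - (H_X - H_Y)) < 1e-12
```

The assertion confirms term by term what the algebra shows: conditioning on $Y$ removes exactly $H(Y)$ bits of uncertainty about $X$.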
This agrees with the intuition that the outcomes of $Y$ give us information about the value of $X$, and therefore should reduce the remaining uncertainty about $X$; here the reduction is exactly $H(Y)$.
Obviously, as pointed out in one of the comments, $Y$ is completely determined by $X$, and therefore $H(Y\mid X)=0$: for every pair $(x,y)$ with nonzero probability we have $p_{xy}=p_x$, so each term $p_{xy}\log(p_{xy}/p_x)$ vanishes.
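The vanishing of $H(Y\mid X)$ can also be checked numerically. Again the distribution and the set $T$ below are my own made-up example: the joint distribution puts mass $p_x$ on the single pair $(x, \mathbf{1}_{x\in T})$, so every ratio $p_{xy}/p_x$ equals $1$.

```python
import math

# Made-up example: Y = 1 iff X is in T (arbitrary p and T).
p = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}
T = {1, 2}

# Joint distribution: the only pair with nonzero mass for each x is
# (x, indicator of x in T), carrying probability p_x.
joint = {(x, 1 if x in T else 0): p[x] for x in p}

# H(Y|X) = -sum p_{xy} log(p_{xy}/p_x); every ratio is 1, every log is 0.
H_Y_given_X = -sum(pxy * math.log2(pxy / p[x])
                   for (x, y), pxy in joint.items() if pxy > 0)
assert H_Y_given_X == 0.0
```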