I'm trying to figure out what this question is asking and what exactly I need to calculate.
I'm told: you have the cards 2-5 of each suit, except the 2s and 3s of the red suits. So 12 cards total.
I've calculated all the probability distributions (conditional and joint), the entropies (conditional and joint), and the mutual information.
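For concreteness, here is a minimal sketch of those calculations. It assumes the two variables in play are the card's *color* and its *rank* (that split is my reading of the setup, not something stated in the question):

```python
from collections import Counter
from math import log2

# The 12-card deck: ranks 2-5 for the black suits, ranks 4-5 for the red suits.
deck = [("black", r) for _suit in ("spades", "clubs") for r in (2, 3, 4, 5)] \
     + [("red", r) for _suit in ("hearts", "diamonds") for r in (4, 5)]

def entropy(counts):
    """Shannon entropy in bits of an empirical distribution given as counts."""
    n = sum(counts)
    return -sum(c / n * log2(c / n) for c in counts if c)

h_color = entropy(Counter(c for c, _ in deck).values())  # H(color)
h_rank  = entropy(Counter(r for _, r in deck).values())  # H(rank)
h_joint = entropy(Counter(deck).values())                # H(color, rank)
mi = h_color + h_rank - h_joint                          # I(color; rank)
```

Since all six distinct (color, rank) pairs occur equally often, the joint entropy comes out to $\log_2 6 \approx 2.585$ bits, and the mutual information to about $0.252$ bits.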
It sounds like the question is NOT asking for a conditional entropy? Or am I misunderstanding entropy? It's an AVERAGE amount of information, right? I'm not sure what I'm actually supposed to calculate.
Learning that the card is red takes you from 12 equally likely possibilities down to the 4 equally likely red cards. So now you are $\log_2 4 = 2$ bits short of complete knowledge of the card, whereas you originally were $\log_2 12$ bits short. You gained $\log_2 12 - 2 = \log_2 3 \approx 1.585$ bits of information.
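That arithmetic is just the surprisal of the event "the card is red". A quick sanity check, assuming the 12-card deck described in the question (8 black cards, 4 red):

```python
from math import log2

n_cards, n_red = 12, 4

# Surprisal of observing "red": -log2 P(red).
info_gained = -log2(n_red / n_cards)

# Equivalently: bits short before (log2 12) minus bits short after (log2 4 = 2).
print(info_gained)  # ≈ 1.585 bits
```

Both forms agree because $-\log_2(4/12) = \log_2 12 - \log_2 4$.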