I know that for two systems $X$ and $Y$, we can write:
$$ H(X,Y)=H(X)+H(Y)-I(X,Y)$$
where $I$ is called the mutual information and $H$ is the Shannon entropy.
My question is: is there another equation constraining $I$, or is this the only one?
Say I know the entropies of $X$ and $Y$, i.e. $H(X)$ and $H(Y)$: am I really missing one piece of information needed to deduce $I$?
For example, if $H(Y)$ vanishes, I immediately have $I(X,Y)=H(Y)=0$.
But this is a "boundary" case; it follows from the fact that
$$0 \leq I(X,Y)\leq H(Y)$$
So my question is: in the "general" case, how can I tell whether there is another relationship between these quantities in addition to $H(X,Y)=H(X)+H(Y)-I(X,Y)$?
It doesn't seem obvious to me.
There are other formulas, but they always involve either the conditional or the joint entropies, e.g. $$ I(X,Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) $$ The reason is that the mutual information measures the information that $X$ holds about $Y$ (and vice versa), whereas the marginal entropies $H(X)$ and $H(Y)$ only capture information intrinsic to the distributions of $X$ and $Y$ separately. Notice that $$ H(X) = -\sum_i p(x_i)\log p(x_i) $$ depends only on the marginal distribution of $X$. So if you ask whether you are really missing one piece of information when you only know $H(X)$ and $H(Y)$:
The answer is yes, you are missing the information about the connection between $X$ and $Y$.
One way to see this is via the identity $$ \mathcal{D}_\text{KL}[p(x,y)\,||\,p(x)p(y)] = I(X,Y)$$ which shows that you need information about the joint distribution to compute $I$.
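A quick numerical illustration of this point: the two joint distributions below have exactly the same uniform marginals on $\{0,1\}$, so $H(X)$ and $H(Y)$ agree, yet their mutual information differs. This is a minimal sketch in plain Python; the helper names `entropy` and `mutual_information` are my own, not from any library.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits; the term 0 * log 0 is taken as 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X,Y) = H(X) + H(Y) - H(X,Y), with the joint pmf as a dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

# Two joint pmfs with the SAME uniform marginals on {0, 1}:
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
perfectly_correlated = {(0, 0): 0.5, (1, 1): 0.5}  # X = Y with prob. 1

print(mutual_information(independent))           # 0.0
print(mutual_information(perfectly_correlated))  # 1.0 (= H(X) = H(Y) = 1 bit)
```

In both cases $H(X)=H(Y)=1$ bit, but $I(X,Y)$ is $0$ in one and $1$ in the other, confirming that the marginal entropies alone cannot determine $I$.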