Let $X, Y$ be random variables and $p$ be a probability measure.
$I[X,Y] := -\sum_{x,y} p(x,y)\ln p(x,y)$, where $p(x,y) := P(X=x, Y=y)$.
I have to show that $I[X, Y] \le I[X] + I[Y]$, and I may not assume that $X$ and $Y$ are independent. My problem is how to rewrite $p(x,y)$ in terms of $p(x)$ and $p(y)$.
Any tips or ideas on how to separate $p(x,y)$ and show the inequality? Thanks in advance!
The question "Conditional Entropy is less than entropy" is similar, but it doesn't fully answer mine, since their definition of $I$ is different: over there, $I[X,Y] = \sum_{x,y} p(x,y) \ln \frac{p(x,y)}{p(x)p(y)}$, which is the mutual information rather than the joint entropy.
By the chain rule for entropy, $$ I[X,Y] = I[X] + I[Y|X] $$
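Writing the chain rule out with the definitions above, using $p(x,y) = p(x)\,p(y\mid x)$ and the conditional entropy $I[Y|X] := -\sum_{x,y} p(x,y)\ln p(y\mid x)$:
$$ I[X,Y] = -\sum_{x,y} p(x,y)\ln\bigl(p(x)\,p(y\mid x)\bigr) = -\sum_{x,y} p(x,y)\ln p(x) \;-\; \sum_{x,y} p(x,y)\ln p(y\mid x) = I[X] + I[Y|X], $$
where the first sum collapses to $I[X]$ because $\sum_y p(x,y) = p(x)$.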
Conditioning reduces entropy, so $$ I[Y|X] \leq I[Y] $$
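One way to see this (restricting the sums to pairs with $p(x,y) > 0$) is Jensen's inequality applied to the concave function $\ln$:
$$ I[Y|X] - I[Y] = \sum_{x,y} p(x,y)\ln\frac{p(y)}{p(y\mid x)} \le \ln\sum_{x,y} p(x,y)\,\frac{p(y)}{p(y\mid x)} = \ln\sum_{x,y} p(x)\,p(y) = \ln 1 = 0. $$
The gap $I[Y] - I[Y|X]$ is exactly the mutual information from the linked question, which is why that question's nonnegativity result is the same fact in disguise.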
Thus, $$ I[X,Y] \leq I[X] + I[Y] $$
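A quick numerical sanity check of the inequality, as a sketch in Python/NumPy (the helper name `entropy` and the particular random joint distribution are my own choices, not from the question):

```python
import numpy as np

def entropy(p):
    """Shannon entropy with natural log, skipping zero-probability outcomes."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(0)
# A random 4x5 joint distribution p(x, y); X and Y are dependent in general.
joint = rng.random((4, 5))
joint /= joint.sum()

I_XY = entropy(joint)               # joint entropy I[X,Y]
I_X = entropy(joint.sum(axis=1))    # entropy of the marginal p(x)
I_Y = entropy(joint.sum(axis=0))    # entropy of the marginal p(y)

print(I_XY <= I_X + I_Y)  # → True (subadditivity)
```

Equality holds exactly when $X$ and $Y$ are independent, i.e. $p(x,y) = p(x)p(y)$, which a generic random joint distribution will not satisfy.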