Mutual information of a discrete and a continuous stochastic variable


As part of a homework, I have a "quantizer" consisting of variables $X_{1}$ and $X_{2}$ which have the following joint distribution.

[figure: joint distribution of $X_1$ and $X_2$, not reproduced here]

$X_2$ is discrete and I can assume that all probabilities are uniform.

Now I am supposed to calculate the mutual information $I(X_1;X_2) = h(X_1) + h(X_2) - h(X_1,X_2)$ and interpret the result. $h(X)$ is the differential entropy of $X$.

But according to the book (Elements of Information Theory, p. 229): "The differential entropy of a discrete random variable can be considered to be $-\infty$."

When I plug this into my equation, I get: $I(X_1;X_2) = h(X_1) + (-\infty) - (-\infty)$

How do I interpret that?

Accepted answer:

Hint: it's easier to use $I(X_1; X_2) = h(X_1) - h(X_1|X_2)$.

Hint 2: $p(X_1|X_2 = 1)$ is a continuous distribution.
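The hints can be checked on a concrete toy quantizer. Since the original joint distribution is only given as an image, the example below *assumes* a hypothetical setup: $X_2$ uniform on $\{1,\dots,4\}$ and $X_1\,|\,X_2=k$ uniform on the unit interval $[k-1, k)$, so that $X_1$ is uniform on $[0,4)$ and $X_2$ is its quantized value. With these assumptions, $h(X_1|X_2)=0$ and the decomposition $I(X_1;X_2) = h(X_1) - h(X_1|X_2)$ stays finite:

```python
import numpy as np

# Hypothetical quantizer (the actual joint distribution from the question
# is in an image not reproduced here): X2 is uniform on {1,2,3,4}, and
# X1 | X2=k is uniform on [k-1, k). Marginally, X1 is uniform on [0, 4).

n = 4  # number of quantization levels (assumption)

# Differential entropy of X1 ~ Uniform[0, n): h(X1) = log2(n) bits
h_X1 = np.log2(n)

# Conditional differential entropy: X1 | X2=k is uniform on a unit-length
# interval, so h(X1 | X2=k) = log2(1) = 0 for every k, hence h(X1|X2) = 0
h_X1_given_X2 = 0.0

# Mutual information via the answer's hint: I(X1;X2) = h(X1) - h(X1|X2)
I = h_X1 - h_X1_given_X2

# Sanity check: since X2 is a deterministic function of X1, the mutual
# information should equal the *discrete* entropy H(X2) of the uniform
# quantizer output, H(X2) = log2(n)
H_X2 = -sum((1 / n) * np.log2(1 / n) for _ in range(n))

print(I, H_X2)  # both 2.0 bits
```

This illustrates why the $h(X_2) - h(X_1,X_2) = -\infty - (-\infty)$ form is ill-posed while the conditional form is not: both infinite terms come from the discreteness of $X_2$ and cancel, leaving the finite quantity $I(X_1;X_2) = H(X_2)$ when $X_2$ is a function of $X_1$.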