Suppose N is a continuous random variable and X a discrete random variable.
How can I calculate H(X|Y) if Y = X + N?
- N has a triangular distribution between -1 and 1
- X takes the values ±0.5 with equal probability
Assume the following are known: H(X), h(N)=h(Y|X), h(Y), the pmf of X, and the pdfs of N and Y.
All the entropy terms are finite.
It's true that you shouldn't mix up the "true" (Shannon) entropy with the differential entropy: differential entropy is not a true Shannon entropy, and, for one thing, the true entropy of a (non-degenerate) continuous variable is infinite. But it's still true that the mutual information of any two random variables is well defined, and it can be computed as a difference of either true entropies or differential entropies (see for example here or here).
So, if $X$ is discrete and $Y$ continuous, we are justified in writing $I(X;Y)=H(X)-H(X|Y)=h(Y)-h(Y|X)$ and hence
$$ H(X|Y) = H(X) - I(X;Y) = H(X) - h(Y) + h(Y|X). $$
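For the specific setup in the question, this formula can be evaluated numerically. Here is a small sketch (assuming the triangular pdf is symmetric with mode 0, which the question does not state explicitly): $H(X)=\log_2 2 = 1$ bit, $h(Y|X)=h(N)$, and $f_Y(y)=\tfrac12 f_N(y-0.5)+\tfrac12 f_N(y+0.5)$, so $H(X|Y)$ follows from a one-dimensional integration.

```python
import numpy as np

# pdf of N: symmetric triangular on [-1, 1] with mode 0 (an assumption),
# f(n) = 1 - |n| on the support, 0 elsewhere
def f_N(n):
    return np.clip(1.0 - np.abs(n), 0.0, None)

# pdf of Y = X + N with X = ±0.5 equiprobable: an equal-weight mixture
# of the triangular pdf shifted to +0.5 and to -0.5
def f_Y(y):
    return 0.5 * f_N(y - 0.5) + 0.5 * f_N(y + 0.5)

# differential entropy in bits via a simple Riemann sum,
# using the convention 0 * log 0 = 0
def diff_entropy_bits(pdf, lo, hi, n=2_000_001):
    y = np.linspace(lo, hi, n)
    dy = y[1] - y[0]
    p = pdf(y)
    integrand = np.where(p > 0, -p * np.log2(p), 0.0)
    return integrand.sum() * dy

H_X = 1.0                                 # H(X) = log2(2), X is a fair binary choice
h_N = diff_entropy_bits(f_N, -1.0, 1.0)   # h(N) = h(Y|X)
h_Y = diff_entropy_bits(f_Y, -1.5, 1.5)   # support of Y is [-1.5, 1.5]

I_XY = h_Y - h_N                          # I(X;Y) = h(Y) - h(Y|X)
H_X_given_Y = H_X - I_XY                  # H(X|Y) = H(X) - I(X;Y)
print(H_X_given_Y)                        # ≈ 0.3607 bits (0.25 nats)
```

With these assumptions the integrals also work out in closed form: $h(N)=\tfrac12$ nat, $h(Y)=\tfrac14+\ln 2$ nats, giving $H(X|Y)=\tfrac14$ nat $\approx 0.3607$ bits, which the numerical result matches.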