Given that the relative entropy is $\ge 0$:
If I take the relative entropy of $p(x,y)$ and $p(x)$, $H(p(x,y)\|p(x)) = \sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)}$, I keep getting $H(X,Y) \le H(X)$, which would mean the conditional entropy $H(Y|X)$ is negative, and Shannon entropies cannot be negative. What am I doing wrong here?
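To spell out the step that leads me there (using $p(y|x) = p(x,y)/p(x)$):

$$\sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)} = \sum_{x,y} p(x,y)\log p(y|x) = -H(Y|X),$$

and since the left-hand side also equals $H(X) - H(X,Y)$, taking it to be $\ge 0$ forces $H(X,Y) \le H(X)$, i.e. $H(Y|X) \le 0$.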
Upon some research, the reason this is happening is, as mentioned in the comments, that the conditional entropy cannot be shown to be non-negative using the relative entropy: $p(x)$ is not a probability distribution over the joint alphabet (for a finite alphabet $\mathcal{Y}$ it sums to $|\mathcal{Y}|$, not $1$), so the non-negativity of relative entropy (Gibbs' inequality) does not apply to this expression. All you can do is find an expression for the conditional entropy, $H(Y|X) = -\sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)}$, but there is no proof from this approach that it is $\ge 0$. The best explanations I have found are here.
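As a quick numerical sanity check, here is a minimal sketch on a toy $2 \times 2$ joint distribution (the distribution and variable names are my own, chosen for illustration): the expression from the question comes out negative, it matches $-H(Y|X)$, and the would-be "$q$" fails to normalize over the joint alphabet.

```python
import numpy as np

# Toy joint distribution p(x, y) over a 2x2 alphabet (rows: x, cols: y).
p_xy = np.array([[0.1, 0.3],
                 [0.2, 0.4]])
p_x = p_xy.sum(axis=1)  # marginal p(x)

# The quantity from the question: sum_{x,y} p(x,y) * log(p(x,y) / p(x)).
expr = np.sum(p_xy * np.log(p_xy / p_x[:, None]))

# Conditional entropy H(Y|X) = -sum_{x,y} p(x,y) * log(p(y|x)),
# where p(y|x) = p(x,y) / p(x).
h_y_given_x = -np.sum(p_xy * np.log(p_xy / p_x[:, None]))

print(expr)          # negative: the expression equals -H(Y|X)
print(h_y_given_x)   # positive, as a Shannon entropy must be

# The would-be "q" in the divergence, q(x,y) = p(x), is not normalized
# over the joint alphabet, so Gibbs' inequality does not apply to it:
print(np.sum(np.broadcast_to(p_x[:, None], p_xy.shape)))  # 2.0, not 1.0
```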