I'm trying to write the explicit sum of $D_{KL} \big(p(X | A, B) ~||~ p(X|A)\big)$, where $D_{KL}$ is the Kullback-Leibler divergence (also known as relative entropy).
I know I should be summing over $X$, $A$, and $B$, but I'm not sure I'm doing it correctly:
$$ D_{KL} \big(~p(X | A, B) ~||~ p(X|A)~\big) = \sum_b p(b) ~ \sum_a p(a|b) \sum_x p(x|a,b)\log\frac{p(x|a,b)}{p(x|a)} $$
Is this correct? I tried using the chain rule of relative entropy to simplify the sum, but it only confused me further, because the left-hand distribution is conditioned on both $A$ and $B$, while the right-hand one is conditioned on $A$ alone.
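As a sanity check, I also tried evaluating the nested sum numerically for a small randomly generated joint distribution $p(a,b,x)$ and comparing it with the equivalent joint-expectation form $\sum_{a,b,x} p(a,b,x)\log\frac{p(x|a,b)}{p(x|a)}$ (the array shapes and variable names below are my own, just for illustration):

```python
import numpy as np

# Hypothetical small joint distribution p(a, b, x) with shape (|A|, |B|, |X|)
rng = np.random.default_rng(0)
p_abx = rng.random((2, 3, 4))
p_abx /= p_abx.sum()

p_ab = p_abx.sum(axis=2)                 # p(a, b)
p_b = p_ab.sum(axis=0)                   # p(b)
p_a = p_ab.sum(axis=1)                   # p(a)
p_a_given_b = p_ab / p_b                 # p(a | b)
p_x_given_ab = p_abx / p_ab[:, :, None]  # p(x | a, b)
p_x_given_a = p_abx.sum(axis=1) / p_a[:, None]  # p(x | a)

log_ratio = np.log(p_x_given_ab / p_x_given_a[:, None, :])

# Nested sum from the question: sum_b p(b) sum_a p(a|b) sum_x p(x|a,b) log(...)
nested = np.einsum('b,ab,abx->', p_b, p_a_given_b, p_x_given_ab * log_ratio)

# Joint-expectation form: sum_{a,b,x} p(a,b,x) log(...)
joint = np.sum(p_abx * log_ratio)

print(nested, joint)
```

Since $p(a,b,x) = p(b)\,p(a|b)\,p(x|a,b)$, the two values agree, which at least confirms the weighting structure of the nested sum.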