Relative entropy of marginals of a distribution


Suppose we have a correlated joint distribution over two coin flips, $$P=\{p,0,0,1-p\},$$ with entries ordered as $\{HH,HT,TH,TT\}$, so that the marginal distribution of each coin is $\{p,1-p\}$.

Denote by $Q= \{p,1-p\} \otimes \{p,1-p\}$ the product distribution generated by the marginals. Ordinarily one computes the mutual information as $D(P\mathrel{\|}Q)$. However, I'm interested in $D(Q\mathrel{\|}P)$, for which I get:

$$D(Q\mathrel{\|}P) = p^2 \log\dfrac{p^2}p + 2p(1-p)\log\dfrac{p(1-p)}0 + (1-p)^2\log\dfrac{(1-p)^2}{1-p} = \infty.$$

Since this is infinite for every $p \notin \{0,1\}$, is that correct?


Yes. As per Wikipedia (adapting notation):

$D(Q\mathrel{\|}P)$ is defined only if for all $i$, $P(i) = 0 \implies Q(i)=0$

That condition does not hold here, since $P(i)$ (the true joint distribution) is zero at outcomes ($HT$ and $TH$) where $Q(i)$ (the product of the marginals) is non-zero.

In such cases, one says that $D(Q\mathrel{\|}P)$ is undefined, or that it diverges to $+\infty$ (consistent with $D(Q\mathrel{\|}P)$ being non-negative).
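A quick numerical check illustrates the asymmetry. The sketch below (the helper `kl_divergence` and the choice $p=0.3$ are my own, not from the post) uses the convention $0\log 0 = 0$ and returns $+\infty$ as soon as some $Q(i)>0$ meets $P(i)=0$:

```python
import math

def kl_divergence(q, p):
    """D(Q || P) = sum_i q_i * log(q_i / p_i), in nats.

    A term with q_i = 0 contributes 0 (convention 0 log 0 = 0);
    a term with q_i > 0 and p_i = 0 makes the divergence infinite.
    """
    total = 0.0
    for qi, pi in zip(q, p):
        if qi == 0:
            continue          # 0 log 0 = 0
        if pi == 0:
            return math.inf   # absolute continuity fails
        total += qi * math.log(qi / pi)
    return total

p = 0.3
P = [p, 0.0, 0.0, 1 - p]                 # joint over (HH, HT, TH, TT)
marg = [p, 1 - p]
Q = [a * b for a in marg for b in marg]  # product of the marginals

print(kl_divergence(P, Q))  # mutual information: finite
print(kl_divergence(Q, P))  # inf: Q puts mass on HT, TH where P is zero
```

For this $P$, $D(P\mathrel{\|}Q)$ reduces to the binary entropy $H(p)$ and is finite, while $D(Q\mathrel{\|}P)$ is infinite for every $p \notin \{0,1\}$, matching the computation in the question.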