Kullback-Leibler divergence of binomial distributions


Suppose $P \sim \mathrm{Bin}(n,p)$ and $Q \sim \mathrm{Bin}(n,q)$. Their Kullback-Leibler divergence is defined by $$D_{KL}(P\,||\,Q)=\mathbb{E}_{P}\left[\log\left(\frac{p(x)}{q(x)}\right)\right],$$ where $p(x)$ and $q(x)$ are the pmfs of $P$ and $Q$ respectively, and the expectation is taken with respect to $P$. For the case above, I can write out the expectation and obtain the answer $$np\log\left(\frac{p}{q}\right)+n(1-p)\log\left(\frac{1-p}{1-q}\right).$$
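As a sanity check, this closed form can be verified numerically against the defining sum; the function names below are my own, not from any library:

```python
from math import comb, log

def kl_binomial_direct(n, p, q):
    # D_KL(P||Q) = sum_k P(k) * log(P(k)/Q(k)) for P ~ Bin(n,p), Q ~ Bin(n,q).
    # The binomial coefficients cancel inside the log, but we keep them for clarity.
    total = 0.0
    for k in range(n + 1):
        pk = comb(n, k) * p**k * (1 - p)**(n - k)
        qk = comb(n, k) * q**k * (1 - q)**(n - k)
        if pk > 0:
            total += pk * log(pk / qk)
    return total

def kl_binomial_closed(n, p, q):
    # Closed form: n*p*log(p/q) + n*(1-p)*log((1-p)/(1-q)).
    return n * p * log(p / q) + n * (1 - p) * log((1 - p) / (1 - q))
```

For example, `kl_binomial_direct(10, 0.3, 0.5)` and `kl_binomial_closed(10, 0.3, 0.5)` agree to floating-point precision.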

But now I have to compute it for $P \sim \mathrm{Bin}(n,p)$ and $Q \sim \mathrm{Bin}(n+1,q)$, and for $P \sim \mathrm{Bin}(n,p)$ and $Q \sim \mathrm{Bin}(n-1,q)$.
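When the two binomials have different numbers of trials, the sum can still be evaluated directly over the support of $P$; a minimal sketch (the function names are mine) is:

```python
from math import comb, log, inf

def binom_pmf(n, p, k):
    # pmf of Bin(n, p) at k, assuming 0 <= k <= n.
    return comb(n, k) * p**k * (1 - p)**(n - k)

def kl_divergence(n_p, p, n_q, q):
    # D_KL(P||Q) by direct summation, for P ~ Bin(n_p, p), Q ~ Bin(n_q, q).
    total = 0.0
    for k in range(n_p + 1):
        pk = binom_pmf(n_p, p, k)
        if pk == 0:
            continue
        qk = binom_pmf(n_q, q, k) if k <= n_q else 0.0
        if qk == 0:
            return inf  # P puts mass on a point where Q has none
        total += pk * log(pk / qk)
    return total
```

Note that if $P$ assigns positive probability to a value outside the support of $Q$ (as happens when $0 < p < 1$ and $Q \sim \mathrm{Bin}(n-1,q)$, since $P(X=n)>0$ while $Q$ only supports $0,\dots,n-1$), the divergence is infinite.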

But then I run into problems with the formulas and the expected values. Can someone help me find the correct answer? Thank you.