Entropy of a beta-binomial compound distribution


I am trying to calculate the entropy of a beta-Bernoulli distribution, which is the special case of the beta-binomial distribution with $n=1$.

I believe that the correct approach is to take the Beta-Binomial PMF (with $n=1$):

$$ P(k \mid 1,\alpha ,\beta )= {1 \choose k}{\frac {\mathrm {B}(k+\alpha ,1-k+\beta )}{\mathrm {B}(\alpha ,\beta )}} $$ where $\mathrm{B}(\cdot,\cdot)$ is the Beta function, and plug it into the Shannon entropy.
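As a quick sanity check, this PMF can be evaluated numerically. The sketch below uses only the standard library; `log_beta` and `beta_binom_pmf` are hypothetical helper names of my own, and the hyperparameters are arbitrary example values:

```python
from math import lgamma, exp, comb

def log_beta(a, b):
    # log B(a, b) = log Γ(a) + log Γ(b) − log Γ(a + b)
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def beta_binom_pmf(k, n, alpha, beta):
    # P(k | n, α, β) = C(n, k) · B(k + α, n − k + β) / B(α, β)
    return comb(n, k) * exp(log_beta(k + alpha, n - k + beta) - log_beta(alpha, beta))

# Beta-Bernoulli case: n = 1, example hyperparameters
alpha, beta = 2.0, 5.0
p0 = beta_binom_pmf(0, 1, alpha, beta)  # P(X = 0)
p1 = beta_binom_pmf(1, 1, alpha, beta)  # P(X = 1)
```

With $n=1$ the two probabilities sum to one, and in fact reduce to $\beta/(\alpha+\beta)$ and $\alpha/(\alpha+\beta)$.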


Here is how far I have got, reminding ourselves of the model:

$$ X\sim \operatorname {Bin} (n,p) $$ then $$ P(X=k \mid p,n)=L(p \mid k)={n \choose k}p^{k}(1-p)^{n-k}. $$ With $n=1$ we get $$ P(X=k \mid p,1)=L(p \mid k)={1 \choose k}p^{k}(1-p)^{1-k}, $$ so we are saying that $X$ is defined on the binary space $\{0,1\}$. Also $$ {\binom {n}{k}}={\frac {n!}{k!(n-k)!}}\;\overset{n=1}{=}\;{\frac {1!}{k!(1-k)!}} = 1 \quad \text{for } k \in \{0,1\}. $$

Recall also that entropy is defined as:

$$ \mathrm{H} (X) =\mathbb {E} [-\log(\mathrm {P} (X))] $$
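For a discrete distribution this expectation is just a finite sum. A minimal sketch (the helper name `entropy` is my own; natural logs, so the result is in nats):

```python
from math import log

def entropy(pmf):
    # H(X) = E[−log P(X)] = Σ_k −p_k·log(p_k); terms with p_k = 0 contribute 0
    return -sum(p * log(p) for p in pmf if p > 0)

# e.g. a fair coin has entropy log 2 ≈ 0.693 nats
h_fair = entropy([0.5, 0.5])
```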


Let's plug our PMF expression for the Beta-Binomial (defined at the top) into the definition of entropy:

$$ \mathrm{H} [X = k] = \mathbb{E} \left [ - \log{\left (\frac{{\binom{1}{k}}}{\mathrm{B}{\left (\alpha,\beta \right )}} \mathrm{B}{\left (\alpha + k,\beta - k + 1 \right )} \right )} \right] $$ which simplifies to $$ \begin{align} \mathrm{H} [X = k] &= \mathbb{E} \left [ \log{\mathrm{B}{\left (\alpha,\beta \right )}} - \log \mathrm{B}{\left (\alpha + k,\beta - k + 1 \right )} - \log{{\binom{1}{k}}} \right ] \\ &= \mathbb{E}\left [\log{\mathrm{B}{\left (\alpha,\beta \right )}}\right ] - \mathbb{E} \left[\log \mathrm{B}{\left (\alpha + k,\beta - k + 1 \right )}\right ] - \mathbb{E} \left [\log{{\binom{1}{k}}} \right]. \end{align} $$

Which reduces to:

$$ \begin{equation} \mathrm{H} [X =k] = \log{\mathrm{B}{\left (\alpha,\beta \right )}} - \psi(\alpha+k) + \psi(\alpha + \beta + 1) - \mathbb{E} \left [\log{{\binom{1}{k}}} \right]. \end{equation} $$

where $\psi(\cdot)$ is the digamma function. The problem is now the last expectation:

$$ \mathbb{E} \left [\log{{\binom{1}{k}}} \right] $$

I am not sure this makes sense; how can one take the expectation of the log of a binomial coefficient? I feel like I have gone wrong somewhere.

There are 2 answers below.


Since $X$ takes values only in $\{0,1\}$ and $\binom{1}{0}=\binom{1}{1}=1$, the quantity inside the expectation is identically zero:
$$ \mathbb{E} \left [\log{\binom{1}{k}} \right] = \mathbb{E}\left[\log 1\right] = 0. $$
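A two-line numeric check of this, over the whole support $\{0,1\}$:

```python
from math import comb, log

# C(1, 0) = C(1, 1) = 1, so log C(1, k) = 0 for every k in the support {0, 1};
# the expectation of something identically zero is zero.
log_coeffs = [log(comb(1, k)) for k in (0, 1)]
```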


Be aware that when you turn a Binomial$(n, p)$ r.v. into a Bernoulli$(p)$ r.v., the space of possible values (the number of successes in $n$ independent trials with success probability $p$) changes from $\{0, 1, \dots, n\}$ to $\{0, 1\}$, hence $P(X=0)=1-p$ and $P(X=1)=p$.

The p.m.f. of the Beta-Bernoulli$(1, \alpha, \beta)$ r.v. will be

$$P(X=0)= \frac{\beta}{\alpha+\beta}, \qquad P(X=1)= \frac{\alpha}{\alpha+\beta},$$

which follows from $P(X=1)=\mathbb{E}[p]=\alpha/(\alpha+\beta)$ under the Beta$(\alpha,\beta)$ prior on $p$.

Hence the correct answer to your question turns out to be:

$$H(X)= -\left\{\frac{\beta}{\alpha+\beta} \log \left[\frac{\beta}{\alpha+\beta}\right] + \frac{\alpha}{\alpha+\beta} \log \left[\frac{\alpha}{\alpha+\beta}\right]\right\}= \log(\alpha+\beta) - \frac{\alpha \log \alpha + \beta \log \beta}{\alpha+\beta}.$$
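As a numeric cross-check (with example hyperparameters of my choosing), the direct two-point entropy and the rearranged closed form $\log(\alpha+\beta) - \frac{\alpha\log\alpha + \beta\log\beta}{\alpha+\beta}$ agree:

```python
from math import log

alpha, beta = 2.0, 5.0
p0 = beta / (alpha + beta)   # P(X = 0)
p1 = alpha / (alpha + beta)  # P(X = 1)

# Direct Shannon entropy of the two-point distribution (nats)
h_direct = -(p0 * log(p0) + p1 * log(p1))

# Rearranged closed form in terms of α and β
h_closed = log(alpha + beta) - (alpha * log(alpha) + beta * log(beta)) / (alpha + beta)
```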