Entropy of geometric random variable?


I am wondering how to derive the entropy of a geometric random variable. Where can I find a proof or derivation? I tried searching online, but not many resources seem to be available.

Here is the probability mass function of the geometric distribution: $(1 - p)^{k-1}\,p$

Here is the entropy of the geometric distribution: $\frac{-(1-p)\log_2 (1-p) - p\log_2 p}{p}$

where $p$ is the probability of the event occurring in each individual trial.

Thanks a lot.

2 Answers

Best answer

Assume $ P(X=k) = (1-p)^{k-1}p $, where $ k \in Z^{+} $, then the entropy is

$$ \begin{aligned} \operatorname{Entropy}(X) & = \sum_{k=1}^{+\infty} -(1-p)^{k-1}p \cdot \log_{2}{\left((1-p)^{k-1}p\right)} \\ & = -p \log_{2}{(p)} \sum_{k=1}^{+\infty} (1-p)^{k-1} - p \log_{2}{(1-p)} \sum_{k=1}^{+\infty} (k-1)(1-p)^{k-1} \\ & = -p \log_{2}{(p)} \cdot \frac{1}{p} - p \log_{2}{(1-p)} \cdot \frac{1-p}{p^{2}} \\ & = - \log_{2}{(p)} - \frac{(1-p)\log_{2}{(1-p)}}{p} \end{aligned} $$

using $\sum_{k \ge 1} (1-p)^{k-1} = \frac{1}{p}$ and $\sum_{k \ge 1} (k-1)(1-p)^{k-1} = \frac{1-p}{p^{2}}$.
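As a quick sanity check (my addition, not part of the answer), the closed form can be compared against a direct truncated sum of the entropy series:

```python
import math

def entropy_closed_form(p):
    # H(X) = -log2(p) - (1-p)/p * log2(1-p)
    return -math.log2(p) - (1 - p) / p * math.log2(1 - p)

def entropy_direct_sum(p, terms=10_000):
    # Truncated sum of -P(X=k) * log2(P(X=k)) for k = 1..terms
    total = 0.0
    for k in range(1, terms + 1):
        pk = (1 - p) ** (k - 1) * p
        if pk == 0.0:  # remaining terms have underflowed; tail is negligible
            break
        total -= pk * math.log2(pk)
    return total

for p in (0.1, 0.5, 0.9):
    print(p, entropy_closed_form(p), entropy_direct_sum(p))
```

For $p = \tfrac12$ both give exactly $2$ bits, matching the closed form $-\log_2(\tfrac12) - 1 \cdot \log_2(\tfrac12) = 2$.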

Another answer

Let $ P(X=k) = (1-p)^{k-1}p $, where $ k \in Z^{+} $, then the entropy is given as

$$ \begin{aligned} H(X) & = \sum_{k=1}^{+\infty} -(1-p)^{k-1}p \cdot \log_{2}{((1-p)^{k-1}p)} \\ & = -p \cdot \log_{2}{(p)} \sum_{k=1}^{+\infty} (1-p)^{k-1} - p \cdot \log_{2}{(1-p)} \sum_{k=1}^{+\infty} (k-1)(1-p)^{k-1} \\ \end{aligned} $$

Now, the first series is a straightforward infinite geometric sum; the second summation can be evaluated as follows.

$$ \begin{aligned} S & = -p \log_{2}(1-p) \sum_{k=1}^{+\infty} (k-1)(1-p)^{k-1} \\ & = -p \log_{2}(1-p) \left(0 + (1-p) + 2(1-p)^{2} + 3(1-p)^{3} + \cdots\right) \\ (1-p)S & = -p \log_{2}(1-p) \left((1-p)^{2} + 2(1-p)^{3} + 3(1-p)^{4} + \cdots\right) \\ S - (1-p)S & = -p \log_{2}(1-p) \left((1-p) + (1-p)^{2} + (1-p)^{3} + \cdots\right) \\ pS & = -p \log_{2}(1-p) \cdot \frac{1-p}{1-(1-p)} \\ S & = -\log_{2}(1-p) \cdot \frac{1-p}{p} \end{aligned} $$
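The shifted-series trick above can be double-checked numerically (this snippet and the choice $p = 0.3$ are mine, for illustration only):

```python
import math

p = 0.3  # any 0 < p < 1 works here

# Truncated version of S = -p*log2(1-p) * sum_{k>=1} (k-1)(1-p)^(k-1)
series = sum((k - 1) * (1 - p) ** (k - 1) for k in range(1, 5000))
S_truncated = -p * math.log2(1 - p) * series

# Closed form derived above: S = -log2(1-p) * (1-p)/p
S_closed = -math.log2(1 - p) * (1 - p) / p

print(S_truncated, S_closed)
```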

Now, combining both summations gives us

$$\begin{aligned} H(X) = -\log_{2}(p) - (\frac{1-p}{p})\log_{2}(1-p) \end{aligned}$$