Entropy of geometric random variable with parameter $1/2$


I'm studying for an information theory exam; maybe some of you can help me with an exercise.

What's the entropy of $X$ taking values in $\{1,2,\ldots,n\}$ with $n\to\infty$, where the probabilities are $P(X=i)=1/2^i$?

The question is multiple choice with four options: 1. $\frac{2}{3}$ bits/symbol; 2. $\frac{1}{2}$ bits/symbol; 3. $\infty$ bits/symbol; 4. none of the above.

So far I have: $$ H(X) = - \sum_{i=1}^{n} P(x_i) \cdot\log_2( P(x_i)) $$

So in this case, $$ H(X) = - \sum_{i=1}^{\infty} {1 \over 2^i} \cdot\log_2\left({1 \over 2^i}\right) $$

Since $\log_2(1/x) = -\log_2(x)$ for $x>0$,

$$ H(X) = - \sum_{i=1}^{\infty} {1 \over 2^i}\cdot(-i) $$
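As a quick numerical sanity check (not part of the original exercise), this sum $\sum_{i\geq1} i/2^i$ can be approximated in Python by truncating the series; the terms decay geometrically, so a few dozen terms already exhaust double precision:

```python
import math

# Approximate H(X) = -sum p_i * log2(p_i) with p_i = 1/2**i.
# Truncating at i = 59 leaves a tail far below double precision.
H = -sum((1 / 2**i) * math.log2(1 / 2**i) for i in range(1, 60))
print(H)  # ≈ 2.0 bits/symbol
```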

I also know that:

$$ \sum_{i=1}^{\infty} a \cdot r^{-i} = {a \over r-1}, \quad |r|>1 $$

But in this case I think $a$ must be a constant, right?

Wolfram Alpha gives me $H(X) = 2$ bits/symbol as the result: bit.ly/nbQwgV

Is it correct? Any hints?

Greatly appreciated. Cheers.

Best answer:

Here is a correct version of the question:

What is the entropy of $X$ such that $P(X=i)=1/2^i$ for every positive integer $i$?

You need to know the value of the series $t(x)=\sum\limits_{i\geqslant0}ix^i$. You probably already know the value of the series $s(x)=\sum\limits_{i\geqslant0}x^i$ since $s(x)$ is the classical geometric series: $s(x)=\frac1{1-x}$.

Differentiating the geometric series term by term gives $s'(x)=\sum\limits_{i\geqslant1}ix^{i-1}=\frac1{(1-x)^2}$, and $t(x)=x\,s'(x)$, hence $t(x)=\frac{x}{(1-x)^2}$ for $|x|<1$.

In your case, you start at $i=1$ instead of $i=0$ but this does not change anything since the $0$th term of $t(x)$ is $0$, and you choose $x=\frac12$. Hence $t\left(\frac12\right)=\frac{\frac12}{(1-\frac12)^2}=2$, as desired.
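To confirm (this is just an illustrative check, not part of the original answer), a short Python snippet comparing the closed form $t(x)=x/(1-x)^2$ against a truncated partial sum of the series at $x=\frac12$:

```python
def t_closed(x):
    # Closed form of the series: t(x) = x / (1 - x)**2, valid for |x| < 1.
    return x / (1 - x)**2

def t_series(x, n_terms=200):
    # Partial sum of t(x) = sum_{i >= 1} i * x**i.
    return sum(i * x**i for i in range(1, n_terms + 1))

x = 0.5
print(t_closed(x))  # 2.0
print(t_series(x))  # ≈ 2.0
```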