Mean of $\log k!$ for Binomial distribution


Is there a nice result/approximation/bound for the mean of $g(k)=\log(k!)$ under a Binomial distribution? That is, for the sum $$ \sum_{k=0}^m {m\choose k}p^k(1-p)^{m-k}\log(k!)$$ My ultimate goal is to compute the KL or $\chi^2$ divergence between a Bin$(m,p)$ distribution and a Laplace$(\lambda)$ distribution.

Accepted answer:

The Binomial distribution has mean $\mu_X=mp$ and variance $\sigma_X^2=mp(1-p)$.

A simple approximation, from a second-order Taylor expansion of $g$ about the mean (the delta method), is $E[g(X)] \approx g(\mu_X) + \frac12 g''(\mu_X)\sigma^2_X$.
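A quick numerical sanity check of this second-order approximation (a sketch; the values of $m$ and $p$ below are arbitrary examples, and $g''$ is estimated by a central finite difference rather than computed analytically):

```python
from math import comb, lgamma

def delta_method(g, mu, var, h=1e-4):
    """Second-order Taylor (delta-method) approximation of E[g(X)]:
    g(mu) + (1/2) * g''(mu) * var, with g'' by central finite difference."""
    g2 = (g(mu + h) - 2 * g(mu) + g(mu - h)) / h**2
    return g(mu) + 0.5 * g2 * var

# Compare against the exact binomial expectation of g(k) = log(k!).
m, p = 100, 0.4
mu, var = m * p, m * p * (1 - p)
g = lambda x: lgamma(x + 1)  # log(x!), extended to real x via the gamma function
exact = sum(comb(m, k) * p**k * (1 - p)**(m - k) * g(k) for k in range(m + 1))
approx = delta_method(g, mu, var)
print(exact, approx)
```

For moderate $m$ the two values agree to a few decimal places.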

Here, for fixed $p$ and $m \to \infty$, Stirling's approximation gives $g''(x) \approx \frac{1}{x}$, so:

$$E[\log(X!)]\approx \log(\mu_X !) + \frac{\sigma^2_X}{2 \mu_X} = \log(\mu_X!)+ \frac{(1-p)}{2} $$

Applying Stirling's approximation once more, now to $\log(\mu_X!)$, this is approximately

$$ \left( mp +\frac12\right)\log(mp) - mp + \frac{\log(2\pi) + (1-p)}{2}\approx mp\log(mp),$$

where the last step keeps only the leading-order term for large $m$.
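The closed-form expression can be checked against the exact sum directly (a sketch; the function names and the choice $m=200$, $p=0.3$ are illustrative, not from the original post):

```python
from math import comb, lgamma, log, pi

def exact_mean_log_fact(m, p):
    """Exact E[log(X!)] for X ~ Bin(m, p), by direct summation.
    lgamma(k + 1) equals log(k!)."""
    return sum(comb(m, k) * p**k * (1 - p)**(m - k) * lgamma(k + 1)
               for k in range(m + 1))

def approx_mean_log_fact(m, p):
    """Delta-method + Stirling approximation:
    (mp + 1/2) log(mp) - mp + (log(2*pi) + (1 - p)) / 2."""
    mu = m * p
    return (mu + 0.5) * log(mu) - mu + (log(2 * pi) + (1 - p)) / 2

m, p = 200, 0.3
print(exact_mean_log_fact(m, p))
print(approx_mean_log_fact(m, p))
```

The absolute error decays as $m$ grows, since both the dropped Stirling term ($\tfrac{1}{12\mu_X}$) and the higher-order Taylor terms are $O(1/m)$.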