Lower bound for $E(\log((X+1)!))$?


I have a discrete random variable $X$ such that $E(X)=\mu$. Is there a closed form for the value of $E(\log((X+1)!))$? If not, is there a lower bound for it, for example using Jensen's inequality? I do not know how $X$ is distributed.

Assuming the random variable $X$ is always a nonnegative integer and $E[X] =\mu$:

$f(x) = \log((x+1)!)$ is convex, so $E[f(X)] \geq f(\mu)$ by Jensen's inequality. (When $\mu$ is not an integer, evaluate $f$ via its natural extension $f(\mu) = \log\Gamma(\mu+2)$, which is also convex.)

(It's not too hard to see that $f:\mathbb{Z}_{\geq 0}\rightarrow \mathbb{R}$ is convex. Note that $$f(x) =\log((x+1)!) = \log(x+1) + \log(x!) = \log(x+1) + f(x-1).$$ So the first difference $g(x) = f(x) -f(x-1) = \log(x+1)$ is strictly increasing, which implies convexity.)
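As a quick numerical sanity check (a sketch, not part of the original argument): since $\log((x+1)!) = \log\Gamma(x+2)$, Python's `math.lgamma` computes $f$ directly, and we can verify both the increasing first differences and the Jensen bound on an arbitrary random sample:

```python
import math
import random

# f(x) = log((x+1)!) via the log-gamma function: log((x+1)!) = lgamma(x+2)
def f(x):
    return math.lgamma(x + 2)

# Discrete convexity: the first differences g(x) = f(x) - f(x-1) = log(x+1)
# should be strictly increasing.
gaps = [f(x) - f(x - 1) for x in range(1, 50)]
assert all(a < b for a, b in zip(gaps, gaps[1:]))

# Jensen: E[f(X)] >= f(E[X]), checked on an arbitrary integer-valued sample.
random.seed(0)
sample = [random.randint(0, 20) for _ in range(10_000)]
mu = sum(sample) / len(sample)
mean_f = sum(f(x) for x in sample) / len(sample)
assert mean_f >= f(mu)
```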

Furthermore, this lower bound is sharp: the distribution could be $P(X=n) = 1$ for some nonnegative integer $n$, in which case $\mu = n$ and

$E[f(X)] = f(n) = f(\mu)$.

If you only know the mean of the distribution and nothing else, then there is no upper bound on $E[f(X)]$.

For example, consider the distributions with $P(X=0) = 1- 1/n$ and $P(X=n) = 1/n$ for a positive integer $n$. Then $\mu = 1$ and $$E[f(X)] = \frac{1}{n} \log((1+n)!) \geq \frac{1}{n} \log\left( \left(\frac{n+1}{e}\right)^{n+1}\right) > \frac{1}{n} \log\left( \left(\frac{n}{e}\right)^{n}\right) = \log\left( \frac{n}{e}\right),$$ using Stirling's bound $m! \geq (m/e)^m$ for the lower bound on $(n+1)!$.
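A short numerical sketch of this divergence, again via `math.lgamma` (the particular values of $n$ are arbitrary):

```python
import math

# f(x) = log((x+1)!) via the log-gamma function
def f(x):
    return math.lgamma(x + 2)

# Each distribution P(X=0) = 1 - 1/n, P(X=n) = 1/n has mean 1, yet
# E[f(X)] = (1/n) * log((n+1)!) grows without bound as n increases.
values = [f(n) / n for n in (10, 100, 1_000, 10_000)]
assert all(a < b for a, b in zip(values, values[1:]))
assert values[-1] > math.log(10_000 / math.e)  # matches the Stirling bound
```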

That shows that $E[f(X)]$ can be arbitrarily large if the only thing you know is $E[X]$.

If you also know the standard deviation of $X$, then you can upper-bound $E[f(X)]$ with a Taylor expansion. Extending $f$ to the reals as $f(x) = \log\Gamma(x+2)$, the first derivative is the polygamma function of order 0 (the digamma function) evaluated at $x+2$:
$$f'(x) = \psi(x+2)= \frac{\Gamma'(x+2)}{\Gamma(x+2)}.$$

The second derivative is the polygamma function of order 1 (the trigamma function) evaluated at $x+2$: $$f''(x) = \psi^{(1)}(x+2).$$

The polygamma function of order 1 is strictly decreasing and positive on the positive reals, so $$\max_{x\geq0} f''(x)= f''(0) = \psi^{(1)}(2)= \frac{\pi ^2}{6}-1< \frac{13}{20}.$$

Taylor's theorem (with Lagrange remainder) implies that for any twice-differentiable $f:[0,\infty)\rightarrow \mathbb{R}$, $$f(x) \leq f(\mu) + f'(\mu)(x-\mu) + \max_{z\geq0}f''(z)\,(x-\mu)^2/2.$$ Taking expectations, the linear term vanishes because $E[X-\mu]=0$, so $$E[f(X)] \leq f(\mu) + \left(\frac{\pi ^2}{6}-1\right)\sigma^2/2,$$ where $\sigma^2 = E[(X-\mu)^2]$.
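A minimal check of this variance-based bound on a hypothetical two-point distribution (the support points 3 and 7 are arbitrary, chosen only for illustration):

```python
import math

# f(x) = log((x+1)!) extended to the reals via the log-gamma function
def f(x):
    return math.lgamma(x + 2)

# Hypothetical two-point distribution: P(X=3) = P(X=7) = 1/2,
# so mu = 5 and sigma^2 = 4.
mu, var = 5.0, 4.0
ef = 0.5 * (f(3) + f(7))
bound = f(mu) + (math.pi**2 / 6 - 1) * var / 2
assert ef <= bound
```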

If you can bound $X$, you can tighten the bounds on $E[f(X)]$ using the same technique. If $b_1\leq X \leq b_2$, then $$f(\mu) +\psi^{(1)}(b_2+2)\, \sigma^2/2\leq E[f(X)] \leq f(\mu) +\psi^{(1)}(b_1+2)\, \sigma^2/2.$$

For example, if $P(X=10) = P(X=12) = 1/2$, then $\mu = 11$, $b_1=10$, $b_2=12$, $\sigma^2=1$, and
$$E[f(X)] = (\log(11!) + \log(13!))/2 \approx 20.027,$$ $$f(\mu) = f(11) \approx 19.987,$$ $$f(\mu) +\psi^{(1)}(b_2+2)\, \sigma^2/2 \approx 20.024,$$ $$f(\mu) +\psi^{(1)}(b_1+2)\, \sigma^2/2 \approx 20.031.$$
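These numbers can be reproduced with a short script; the trigamma values use the identity $\psi^{(1)}(m) = \pi^2/6 - \sum_{k=1}^{m-1} 1/k^2$, valid for positive integers $m$:

```python
import math

# f(x) = log((x+1)!) via the log-gamma function
def f(x):
    return math.lgamma(x + 2)

# Trigamma at a positive integer m: psi^(1)(m) = pi^2/6 - sum_{k=1}^{m-1} 1/k^2
def trigamma(m):
    return math.pi**2 / 6 - sum(1 / k**2 for k in range(1, m))

mu, var, b1, b2 = 11, 1, 10, 12
ef = 0.5 * (f(10) + f(12))                   # (log(11!) + log(13!)) / 2
lower = f(mu) + trigamma(b2 + 2) * var / 2   # ~20.024
upper = f(mu) + trigamma(b1 + 2) * var / 2   # ~20.031
assert lower <= ef <= upper
```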