Bounding fractional moments of geometric random variable


The following two bounds for a fractional moment of a geometric random variable $X$ with $\mathbb{P}\left[X = k\right] = p \left(1 - p\right)^k$ where $k \geq 0$ are given in this paper (on page 12):

Let $s \in \left(0, 1\right)$, then

  • $\mathbb{E}\left[ X^s \right] \leq C_1\left(s\right) p^{-s} \left(1 - p\right)$, and
  • $\mathbb{E}\left[ X^{-s} \mathbf{1}_{\left\{X > 0\right\}} \right] \leq C_2\left(s\right) p^s \left(1 - p\right)$

where $C_1$ and $C_2$ are constants depending only on $s$ (these inequalities can be expressed in terms of the polylogarithm, see below).

I would appreciate any help in showing that the second bound is true. I do know how to prove the first bound, but the same method fails for the second. The first bound follows by noting that $f\left(s\right) = \mathbb{E}\left[ \left(p X\right)^s \right]$ is convex and continuous in $s$ and extends continuously to the interval $s \in \left[0, 1\right]$ with $f\left(0\right) = f\left(1\right) = 1-p$. By convexity, $f\left(s\right) \leq \left(1 - s\right) f\left(0\right) + s f\left(1\right) = 1 - p$ for all $s \in \left(0, 1\right)$, which proves the bound with $C_1\left(s\right) = 1$.
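The convexity bound with $C_1(s) = 1$ can be sanity-checked numerically. The sketch below is my own (the helper name `frac_moment` is hypothetical); it truncates the defining series for $\mathbb{E}[X^s]$ and compares it with $p^{-s}(1-p)$:

```python
# Numerical sanity check of E[X^s] <= p^{-s} (1 - p), i.e. C_1(s) = 1.
# P[X = k] = p (1-p)^k for k >= 0, so E[X^s] = sum_{k>=1} k^s p (1-p)^k.

def frac_moment(p, s, n_terms=50_000):
    """Truncated series for E[X^s]; the tail is negligible for p >= 0.1."""
    q = 1.0 - p
    total, qk = 0.0, q  # qk tracks (1-p)^k, starting at k = 1
    for k in range(1, n_terms + 1):
        total += (k ** s) * p * qk
        qk *= q
    return total

for p in (0.1, 0.5, 0.9):
    for s in (0.25, 0.5, 0.75):
        assert frac_moment(p, s) <= p ** (-s) * (1.0 - p)
```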

Trying the same method on the second bound yields the convex and continuous function $g\left(s\right) = \mathbb{E}\left[ \left(pX\right)^{-s} \mathbf{1}_{\left\{X > 0\right\}} \right]$ with $g\left(0\right) = 1-p$, but $g\left(1\right) = - \ln\left(p\right)$ (see this post), so this approach cannot yield a constant $C_2$ that is independent of $p$.

Note that $\mathbb{E}\left[ X^s \right] = \sum_{k \geq 0} k^s p\left(1-p\right)^k = p \cdot \sum_{k \geq 1} \frac{\left(1-p\right)^k}{k^{-s}} = p \cdot \textrm{Li}_{-s}\left(1-p\right)$ where $\textrm{Li}$ denotes the polylogarithm, and $\mathbb{E}\left[ X^{-s} \mathbf{1}_{\left\{X > 0\right\}} \right] = p \cdot \textrm{Li}_{s}\left(1-p\right)$. So useful references on the behavior of the polylogarithm would also be appreciated.
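The polylogarithm identities above can be verified numerically. The following sketch is my own and assumes the mpmath library, whose `polylog(s, z)` evaluates $\textrm{Li}_s(z)$; it compares both identities against directly truncated series:

```python
# Check E[X^s] = p * Li_{-s}(1-p) and E[X^{-s} 1_{X>0}] = p * Li_s(1-p)
# against truncated sums of the defining series (mpmath assumed installed).

from mpmath import mpf, polylog

p, s = mpf("0.3"), mpf("0.4")
q = 1 - p

# E[X^s] = sum_{k>=1} k^s p q^k; tail beyond k = 5000 is negligible
series_pos = sum(mpf(k) ** s * p * q ** k for k in range(1, 5001))
# E[X^{-s} 1_{X>0}] = sum_{k>=1} k^{-s} p q^k
series_neg = sum(mpf(k) ** (-s) * p * q ** k for k in range(1, 5001))

assert abs(series_pos - p * polylog(-s, q)) < 1e-10
assert abs(series_neg - p * polylog(s, q)) < 1e-10
```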

Looking at plots on WolframAlpha (I plotted $\frac{p^{1-s}}{1-p} \textrm{Li}_{s}\left(1-p\right)$, which should be bounded by $C_2\left(s\right)$) shows that the critical region for the bound is $p \to 0$. Calculating the limit on WolframAlpha confirms that such a constant $C_2\left(s\right)$ should indeed exist, and also suggests that the behavior of the polylogarithm is better understood than what my internet searches turned up.

Best Answer

One way of showing the second bound is the following:

Let $Y \sim \textrm{Geo}(p)$ be a random variable with geometric distribution of parameter $p$, but which takes values in $\mathbb{N} \setminus \left\{0\right\}$, so $\mathbb{P}\left[Y = k\right] = q^{k-1} p$ where $k \geq 1$ and $q = 1 - p$. Then we can write \begin{align} \mathbb{E}\left[X^{-s} \mathbf{1}_{\left\{X > 0 \right\}} \right] = \sum_{k \geq 1} k^{-s} q^k p = q \cdot \sum_{k \geq 1} k^{-s} q^{k - 1} p = q \cdot \mathbb{E}\left[Y^{-s}\right], \end{align} so the claimed bound $\mathbb{E}\left[X^{-s} \mathbf{1}_{\left\{X > 0 \right\}} \right] \leq C_2\left(s\right) q p^s$ is equivalent to \begin{align} \mathbb{E}\left[\left(pY\right)^{-s}\right] \leq C_2\left(s\right). \end{align}

Now, $g\left(p\right) = \mathbb{E}\left[\left(pY\right)^{-s}\right]$ is decreasing in $p$ (to see this, note that $\mathbb{E}\left[\left(pY\right)^{-s}\right] = \frac{p^{1 - s}}{1 - p} \cdot \textrm{Li}_s\left(1 - p\right)$, differentiate: https://bit.ly/3BOmLjP, and analyze the sign of the derivative: https://bit.ly/3RVn5CA), so it is sufficient to show that $\lim_{p \to 0} g\left(p\right) =: C_2\left(s\right)$ exists and is finite.

Note that for $p \to 0$, $pY$ converges in distribution to a random variable $Z$ which is $\textrm{Exp}\left(1\right)$-distributed (we have that $\mathbb{P}\left[pY > t\right] = q^{\lfloor t/p \rfloor} \to e^{-t}$ for $p \to 0$). The fractional moments converge as well: $\mathbb{E}\left[\left(pY\right)^{-s}\right] = \sum_{k \geq 1} \left(pk\right)^{-s} p\, q^{k-1}$ is a Riemann-type sum for $\int_0^\infty x^{-s} e^{-x} \, dx = \Gamma\left(1 - s\right)$. Therefore, $\lim_{p \to 0} g\left(p\right) = \mathbb{E}\left[Z^{-s}\right] = \Gamma\left(1 - s\right) < \infty$ for $s \in \left(0, 1\right)$. Thus, we can choose $C_2\left(s\right) = \Gamma\left(1 - s\right)$.
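The argument above can be probed numerically. The sketch below is my own verification code (the helper name `g` mirrors the answer's notation but is otherwise hypothetical); it checks that $g(p) = \mathbb{E}[(pY)^{-s}]$ increases towards $\Gamma(1-s)$ as $p$ decreases:

```python
# Check that g(p) = E[(pY)^{-s}], with Y ~ Geo(p) on {1, 2, ...},
# is increasing as p -> 0 and stays below the limit Gamma(1 - s),
# which is the proposed constant C_2(s).

from math import gamma

def g(p, s, n_terms=400_000):
    """Truncated series for E[(pY)^{-s}] with P[Y = k] = p (1-p)^{k-1}."""
    q = 1.0 - p
    total, qk = 0.0, 1.0  # qk tracks (1-p)^{k-1}, starting at k = 1
    for k in range(1, n_terms + 1):
        total += (p * k) ** (-s) * p * qk
        qk *= q
    return total

s = 0.5
values = [g(p, s) for p in (0.5, 0.1, 0.01)]
# increasing as p decreases, and all below Gamma(1 - s) = sqrt(pi)
assert values[0] < values[1] < values[2] < gamma(1 - s)
```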