Evaluate $E(1/X)$ for rv $X$ of the negative binomial distribution.


Let $X$ be a random variable of the negative binomial distribution, of parameters $r\in \mathbb{N},~p\in (0,1)$. Evaluate $E\left(\dfrac{1}{X}\right)$.

Attempt. Of course $$E\left(\dfrac{1}{X}\right)=\sum_{k=r}^{\infty}\frac{1}{k} P(X=k),$$ but substituting the pmf of $X$ seems to make the calculation difficult. Am I on the right path, or should we instead use the fact that $X$ is a sum of independent geometric rvs?

Thanks for the help.


Edit. Writing out the calculation, we get:

$$E\left(\frac{1}{X}\right)=\sum_{x=r}^{\infty}\frac{1}{x}\binom{x-1}{r-1}p^r(1-p)^{x-r},$$

where the term $\frac{1}{x}$ seems not to be absorbed by the binomial coefficient $\binom{x-1}{r-1}=\frac{(x-1)!}{(r-1)!(x-r)!}$.


There are 3 answers below.

Best Answer

Hint: Differentiating the series for $E\{{1\over X}\}$ with respect to $p$, we obtain:$${dE\{{1\over X}\}\over dp}=r\sum_{x=r}^{\infty}\frac{1}{x}\binom{x-1}{r-1}p^{r-1}(1-p)^{x-r}-\sum_{x=r}^{\infty}(x-r)\frac{1}{x}\binom{x-1}{r-1}p^r(1-p)^{x-r-1}.$$Recognizing $E\{{1\over X}\}$ in the first sum, and writing $\frac{x-r}{x}=1-\frac{r}{x}$ in the second (where the pmf sums to $1$), we conclude:$${dE\{{1\over X}\}\over dp}={r\over p}E\{{1\over X}\}-{1\over 1-p}+{r\over 1-p}E\{{1\over X}\}.$$Can you finish now?
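As a quick numerical sanity check of the hinted ODE (a sketch only; the parameter values and the truncation bound `N` are arbitrary choices, not part of the argument), one can evaluate the truncated series for $E\{{1\over X}\}$, differentiate it in $p$ by a central finite difference, and compare with the right-hand side:

```python
from math import comb

def E_inv_X(p, r, N=20000):
    """Truncated series for E[1/X], X ~ NegBin(r, p) counting trials (support r, r+1, ...)."""
    return sum(comb(x - 1, r - 1) * p**r * (1 - p)**(x - r) / x for x in range(r, N))

r, p, h = 3, 0.4, 1e-6
g = E_inv_X(p, r)

# central finite difference for dE[1/X]/dp
dg = (E_inv_X(p + h, r) - E_inv_X(p - h, r)) / (2 * h)

# right-hand side of the hinted ODE
rhs = (r / p) * g - 1 / (1 - p) + (r / (1 - p)) * g

print(abs(dg - rhs))  # tiny: finite-difference error only
```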

Answer

Our problem is to find \begin{align*} E[1/X] &= \sum_{k = r}^\infty \frac{1}{k}\binom{k-1}{r-1}p^r(1-p)^{k-r}, \\ &= \frac{p^r}{(1-p)^r}\sum_{k = r}^\infty \frac{1}{k}\binom{k-1}{r-1} (1-p)^{k}. \end{align*}

We will need the following fact

$$ x\binom{x-1}{r-1} = r\binom{x}{r},$$

which gives $\frac{1}{k}\binom{k-1}{r-1}=\frac{r}{k^2}\binom{k}{r}$, and hence \begin{align*} E[1/X] &= \frac{rp^r}{(1-p)^r}\sum_{k = r}^\infty \frac{1}{k^2}\binom{k}{r} (1-p)^{k},\\ &= \frac{rp^r}{(1-p)^r} \frac{(1-p)^r {}_2 F_1 \left(r, r; r + 1; 1-p \right) }{r^2},\\ &= \frac{p^r{}_2 F_1 \left(r, r; r + 1; 1-p \right) }{r}, \end{align*} where $${}_2 F_1 \left(a, b; c; z \right) := \sum_{n=0}^\infty \frac{(a)_n (b)_n}{(c)_n}\frac{z^n}{n!} $$ is Gauss's hypergeometric function and $(a)_n$ is a Pochhammer symbol. Here's an R implementation that shows correctness:

## 2F1, with analytic continuation for x < 0
## adapted from https://stats.stackexchange.com/questions/33451/computation-of-hypergeometric-function-in-r
Gauss2F1 <- function(a, b, c, x){
  require(gsl)                       # for hyperg_2F1
  if (x >= 0 && x < 1) {
    hyperg_2F1(a, b, c, x)
  } else {
    ## Pfaff transformation maps x < 0 into [0, 1)
    hyperg_2F1(a, c - b, c, 1 - 1/(1 - x)) / (1 - x)^a
  }
}
## numerical check
r <- 10
p <- .15

## rnbinom counts failures, so add r to get the total number of trials:
## X = failures + r, see https://www.johndcook.com/negative_binomial.pdf
Xp <- rnbinom(1e6, size = r, prob = p) + r
mean(1/Xp)                               # Monte Carlo estimate
p^r / r * Gauss2F1(r, r, r + 1, 1 - p)   # closed form
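For readers without R and the gsl package, the same closed form can be cross-checked in plain Python (a sketch using only the standard library; the helper `hyp2f1` below is a naive partial sum of Gauss's series, valid for $|z|<1$, and the truncation bounds are arbitrary):

```python
from math import comb

def hyp2f1(a, b, c, z, terms=4000):
    """Partial sum of Gauss's 2F1 series; converges for |z| < 1."""
    s, t = 0.0, 1.0
    for n in range(terms):
        s += t
        t *= (a + n) * (b + n) / ((c + n) * (n + 1)) * z
    return s

r, p = 10, 0.15
# exact series for E[1/X] (support starts at r)
exact = sum(comb(k - 1, r - 1) * p**r * (1 - p)**(k - r) / k for k in range(r, 5000))
closed = p**r * hyp2f1(r, r, r + 1, 1 - p) / r
print(abs(exact - closed))  # agreement up to truncation/rounding error
```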
Answer

The task is to find $\mathbb{E}\big[\frac{1}{X}\big]$ for $X \sim \mathcal{NB}(r,p)$. One formulation that leads to a solvable integral uses the fact that $$ \mathbb{E}\bigg[\frac{1}{X}\bigg] = \int_{0}^\infty M_X(-t)\, dt. $$ This holds because, for all $\alpha > 0$, $$ \int_0^\infty e^{-\alpha x}\, dx = \frac{1}{\alpha},$$ so, writing $f$ for the density of $X$ and swapping the order of integration (Tonelli's theorem, since the integrand is nonnegative), \begin{align}\mathbb{E}\bigg[\frac{1}{X}\bigg] &= \int_0^\infty \frac{1}{\alpha} f(\alpha)\, d\alpha \\ &= \int_0^\infty \bigg(\int_0^\infty e^{-\alpha x} f(\alpha)\, d\alpha\bigg)dx \\ &= \int_0^\infty M_X(-x)\, dx. \end{align} (For a discrete rv such as $X$, replace the outer density integral by a sum over the pmf; the argument is unchanged.) Since a negative binomial rv is the sum of $r$ iid geometric rvs, and for iid $X_i$ we have $M_{\sum_{i=1}^{r} X_i}(t) = \prod_{i=1}^{r} M_{X_i}(t) = \big[M_{X_1}(t)\big]^r$, we know that:

$$ Y\sim \mathcal{G}(p) \implies M_Y(t) = \frac{pe^t}{1-qe^t}, \qquad \text{where } q := 1-p.$$

Therefore,

$$ X \sim \mathcal{NB}(r,p) \implies M_X(t) = \bigg(\frac{pe^t}{1-qe^t}\bigg)^r.$$

Using the trick for $\mathbb{E}\big[\frac{1}{X}\big]$, we can see that:

$$ \mathbb{E}\bigg[\frac{1}{X}\bigg] = \int_0^{\infty} \bigg(\frac{pe^{-t}}{1-qe^{-t}}\bigg)^r dt,$$

which is solvable with traditional integration methods: the substitution $u = e^{-t}$ (so $dt = -\,du/u$) turns it into $$\mathbb{E}\bigg[\frac{1}{X}\bigg] = p^r \int_0^1 u^{r-1}(1-qu)^{-r}\, du.$$
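As a numerical sanity check (a sketch with arbitrary parameter choices and quadrature settings, using only the Python standard library), the MGF integral can be compared against the defining pmf series:

```python
from math import comb, exp

r, p = 3, 0.4
q = 1 - p

# series value of E[1/X] (X counts trials, support r, r+1, ...)
series = sum(comb(k - 1, r - 1) * p**r * q**(k - r) / k for k in range(r, 2000))

def mgf_neg(t):
    """M_X(-t) for X ~ NegBin(r, p)."""
    return (p * exp(-t) / (1 - q * exp(-t)))**r

# composite trapezoidal rule on [0, T]; the integrand decays like e^{-rt},
# so the tail beyond T = 60 is negligible
T, n = 60.0, 200000
h = T / n
integral = h * (mgf_neg(0)/2 + sum(mgf_neg(i*h) for i in range(1, n)) + mgf_neg(T)/2)
print(abs(series - integral))  # small quadrature + truncation error
```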