Let $X_1, X_2, \ldots, X_n$ be i.i.d. Bernoulli random variables with $P(X_i = 1) = p$ for every $i \in \{1, \ldots, n\}$. Let $Y = \sum_{i=1}^n X_i$ and let $c$ be a positive number.
I am interested in computing $\mathbb{E}\left[\frac{1}{c + Y}\right]$. I know that the expected value can be lower-bounded using Jensen's inequality: $\mathbb{E}\left[\frac{1}{c + Y}\right] \geq \frac{1}{\mathbb{E}\left[c + Y\right]} = \frac{1}{c + pn}$.
But is it possible to compute $\mathbb{E}\left[\frac{1}{c + Y}\right]$ exactly?
This answer seems to be very relevant, but I am not sure I understand how to correctly extend it to the above case. I would be grateful for any hints.
Let's attack this using the approach from the linked answer.
Here, using $\frac{1}{a}=\int_0^\infty e^{-ta} \, \mathrm{d}t$ for $a>0$ (with $a=c+X_1+\cdots+X_n$) and the independence of the $X_i$, we have
$$\mathbb{E}\left(\frac{1}{c+X_1+\cdots+X_n}\right)=\int_0^\infty e^{-tc} \, \mathbb{E}\left(\exp\left(-t X \right) \right)^n \mathrm{d}t \tag{1}$$
where $X$ denotes a generic $X_i$,
and $\mathbb{E}\left(\exp\left(-t X \right) \right)=p e^{-t} + q$ with $q=1-p$. Hence the integrand in $(1)$ equals
$$ e^{-tc} (p e^{-t} + q)^n=e^{-tc} \sum_{k=0}^n p^k e^{-kt}q^{n-k} \binom{n}{k} \tag{2}$$
Integrating term by term, the expectation is
$$ \sum_{k=0}^n p^k q^{n-k} \binom{n}{k} \int_0^\infty e^{-t(c+k)} dt = \sum_{k=0}^n p^k q^{n-k} \binom{n}{k} \frac{1}{c+k} \tag{3}$$
I'm afraid you cannot simplify this further.
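As a quick numerical sanity check, the sketch below (with hypothetical values $n=10$, $p=0.3$, $c=2$) compares the sum in $(3)$ against a direct quadrature of the integral in $(1)$:

```python
import math

def closed_form(n, p, c):
    """Formula (3): sum_k C(n,k) p^k q^(n-k) / (c+k)."""
    q = 1.0 - p
    return sum(math.comb(n, k) * p**k * q**(n - k) / (c + k)
               for k in range(n + 1))

def integral_form(n, p, c, T=60.0, steps=200_000):
    """Trapezoidal approximation of the integral in (1),
    truncated at t = T (the integrand decays like e^(-tc))."""
    q = 1.0 - p
    h = T / steps
    total = 0.0
    for i in range(steps + 1):
        t = i * h
        f = math.exp(-t * c) * (p * math.exp(-t) + q) ** n
        total += f if 0 < i < steps else f / 2.0  # halve the endpoints
    return total * h

n, p, c = 10, 0.3, 2.0
print(closed_form(n, p, c), integral_form(n, p, c))
```

The two values agree to many decimal places, as expected.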
Notice that $(3)$ could be obtained by a much simpler approach: just write down the expectation of $\frac{1}{c+Y}$ directly, where $Y=X_1+\cdots+X_n$ is a Binomial$(n,p)$ random variable, so $\mathbb{E}\left[\frac{1}{c+Y}\right]=\sum_{k=0}^n \frac{P(Y=k)}{c+k}$.
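To illustrate the direct approach (and how loose the Jensen bound from the question can be), here is a small simulation sketch; the parameter values, seed, and trial count are all arbitrary choices for illustration:

```python
import math
import random

def exact(n, p, c):
    """E[1/(c+Y)] as a direct average of 1/(c+k) against the Binomial pmf."""
    q = 1.0 - p
    return sum(math.comb(n, k) * p**k * q**(n - k) / (c + k)
               for k in range(n + 1))

def monte_carlo(n, p, c, trials=200_000, seed=12345):
    """Simulate Y = X_1 + ... + X_n with X_i ~ Bernoulli(p) and average 1/(c+Y)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        y = sum(rng.random() < p for _ in range(n))
        total += 1.0 / (c + y)
    return total / trials

n, p, c = 10, 0.3, 2.0
print("exact        :", exact(n, p, c))
print("monte carlo  :", monte_carlo(n, p, c))
print("Jensen bound :", 1.0 / (c + n * p))
```

Since $y \mapsto \frac{1}{c+y}$ is strictly convex and $Y$ is non-degenerate, the exact value is strictly larger than the Jensen bound $\frac{1}{c+pn}$.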
Update (inspired by this)
An alternative expression can be obtained by noticing that, since $\int_0^1 x^{c+k-1} \, dx = \frac{1}{c+k}$, our expectation as in $(3)$ (let's call it $G$) can be expressed as
$$G=\int_0^1 (p x+q)^n x^{c-1} dx \tag{4}$$
or, with the substitution $u=px+q$:
$$G= \frac{1}{p^c}\int_q^1 u^n (u-q)^{c-1} du \tag{5}$$
Further, if $c$ is a positive integer:
$$G= \left(\frac{q}{p}\right)^c \sum_{j=1}^c \frac{(-1)^{c-j}}{n+j} \binom{c-1}{j-1} \left( q^{-j} - q^{n}\right) \tag{6}$$
This might be more convenient than $(3)$ when $c \ll n$.
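Both alternative expressions can be checked numerically. The sketch below (hypothetical parameter values; a midpoint rule is used so that the non-integer $c$ case of $(4)$ is handled as well) compares them with the sum in $(3)$:

```python
import math

def series_form(n, p, c):
    """Formula (3): the (n+1)-term sum over the Binomial pmf."""
    q = 1.0 - p
    return sum(math.comb(n, k) * p**k * q**(n - k) / (c + k)
               for k in range(n + 1))

def integral_form(n, p, c, steps=400_000):
    """Midpoint rule for the integral of (p x + q)^n x^(c-1) over [0, 1],
    i.e. (4); the midpoint rule avoids x = 0, where x^(c-1) blows up for c < 1."""
    q = 1.0 - p
    h = 1.0 / steps
    return h * sum((p * (i + 0.5) * h + q) ** n * ((i + 0.5) * h) ** (c - 1.0)
                   for i in range(steps))

def short_sum(n, p, c):
    """The c-term finite-sum expression, valid for positive integer c."""
    q = 1.0 - p
    return (q / p) ** c * sum(
        (-1) ** (c - j) / (n + j) * math.comb(c - 1, j - 1) * (q ** (-j) - q ** n)
        for j in range(1, c + 1))

n, p = 50, 0.2
print(series_form(n, p, 1.5), integral_form(n, p, 1.5))  # non-integer c
for c in (1, 2, 3):
    print(c, series_form(n, p, c), short_sum(n, p, c))
```

For $c=1$ the finite sum collapses to the familiar identity $\mathbb{E}\left[\frac{1}{1+Y}\right]=\frac{1-q^{n+1}}{(n+1)p}$.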