Assume $X_1, X_2, \dots, X_n$ are i.i.d. random variables following the exponential distribution with mean $\theta > 0$, and define the statistic $$ T = \sum\limits_{i = 1}^n X_i. $$
I know that the sum of i.i.d. exponential random variables follows a Gamma distribution, but I cannot infer anything about the inverse $\frac{1}{T}$ of the sum. My guess would be that $$ E\left[\frac{1}{T}\right] = \frac{1}{\theta} \quad \text{and} \quad V\left[ \frac{1}{T} \right] = \frac{1}{\theta^2}, $$ but why?
In general, if $\sf T$ is a positive random variable with a continuous distribution, p.d.f. $\sf{f_T(t)}$, and c.d.f. $\sf{F_T(t)}$, the c.d.f. of $\sf{S=T^{-1}}$ is, for $\sf{s>0}$, $$\sf{F_S(s)=P(S\le s)=P\left(T\ge\frac1s\right)=1-F_T\left(\frac1s\right)},$$ so that the p.d.f. of $\sf S$ is $$\sf{f_S(s)=F'_S(s)=\frac1{s^2}f_T\left(\frac1s\right)}.$$ The mean and variance can then be derived by integrating against $\sf{f_S(s)}$ over the support $\sf{\Omega_S}$. That is, $$\sf{E(S)=\int_{\Omega_S}sf_S(s)\,ds,\quad V(S)=\int_{\Omega_S}s^2f_S(s)\,ds-\left[\int_{\Omega_S}sf_S(s)\,ds\right]^2}.$$ This applies to your problem with $\sf{T\sim Ga(n,\lambda)}$, where $\sf{\lambda=1/\theta}$ is the rate: the sum of $\sf n$ i.i.d. exponentials with mean $\sf\theta$ is Gamma with shape $\sf n$ and rate $\sf{1/\theta}$.
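Carrying out these integrals for $\sf{T\sim Ga(n,1/\theta)}$ gives the standard inverse-gamma moments $$\sf{E\left(\frac1T\right)=\frac{1}{(n-1)\theta}\ (n>1),\quad V\left(\frac1T\right)=\frac{1}{(n-1)^2(n-2)\theta^2}\ (n>2)},$$ so the guess $1/\theta$ does not hold. A quick Monte Carlo check is a useful sanity test; the values of `theta`, `n`, and `n_sims` below are illustrative choices, not taken from the question:

```python
import numpy as np

# Monte Carlo check of E(1/T) and V(1/T), where T is the sum of n
# i.i.d. Exponential(mean theta) draws, i.e. T ~ Gamma(shape=n, rate=1/theta).
# theta, n, n_sims are illustrative values, not from the question.
rng = np.random.default_rng(0)
theta, n, n_sims = 2.0, 5, 1_000_000

# Each row is one sample (X_1, ..., X_n); T is the row sum.
T = rng.exponential(scale=theta, size=(n_sims, n)).sum(axis=1)
S = 1.0 / T

# Closed-form inverse-gamma moments (mean needs n > 1, variance n > 2):
mean_theory = 1.0 / ((n - 1) * theta)                    # here 0.125
var_theory = 1.0 / ((n - 1) ** 2 * (n - 2) * theta ** 2)  # here 1/192

print(S.mean(), mean_theory)  # sample mean should be close to 0.125
print(S.var(), var_theory)    # sample variance close to 1/192 ~ 0.0052
```

With a million replications the sample moments agree with the closed forms to about three decimal places, which makes the $\frac{1}{(n-1)\theta}$ mean (rather than $\frac1\theta$) easy to believe before working through the integral.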