Expectation of inverse of the sum of positive random variables


Suppose we have a sequence of independent, identically distributed positive random variables $X_1,X_2,\cdots\stackrel{i.i.d.}{\sim}\xi$. I am puzzled by the existence of the expectation $$\mathbb{E}\frac{1}{X_1+\cdots+X_n}.$$ (Here, existence means the expectation is finite.)

For example, if $X_1,X_2,\cdots\stackrel{i.i.d.}{\sim}\xi\stackrel{d}{=}\chi^2(1)$ (chi-square distribution with 1 degree of freedom), then $X_1+\cdots+X_n\sim\chi^2(n)$, and we know that $\mathbb{E}\frac{1}{X_1}=\infty$ and $\mathbb{E}\frac{1}{X_1+X_2}=\infty$, but $\mathbb{E}\frac{1}{X_1+\cdots+X_n}=\frac{1}{n-2}<\infty$ if $n\geq 3$.
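The chi-square fact above can be checked numerically (a sketch assuming SciPy is available): since $X_1+\cdots+X_n\sim\chi^2(n)$, the expectation $\mathbb{E}\frac{1}{X_1+\cdots+X_n}$ is just $\int_0^\infty \frac{1}{x}f_{\chi^2(n)}(x)\,dx$, which should come out to $\frac{1}{n-2}$ for $n\ge 3$.

```python
from scipy import integrate
from scipy.stats import chi2

def inv_moment(n):
    """E[1/S] where S ~ chi^2(n), by numerical integration of pdf(x)/x."""
    val, _ = integrate.quad(lambda x: chi2.pdf(x, df=n) / x, 0, float("inf"))
    return val

# For n >= 3 the expectation is finite and equals 1/(n-2).
for n in (3, 5, 10):
    assert abs(inv_moment(n) - 1 / (n - 2)) < 1e-5
```

For $n=1$ or $n=2$ the integrand behaves like $x^{n/2-2}$ near $0$, which is not integrable, matching the infinite expectations stated above.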

My question is: can we find a positive population distribution $\xi$ such that for every $n>0$, with $X_1,\cdots,X_n\stackrel{i.i.d.}{\sim}\xi$, $$\mathbb{E}\frac{1}{X_1+\cdots+X_n}=\infty?$$


Two answers are given below.

**Accepted answer**

An alternative solution offered by my colleague, M.:

Let $\phi_X(t)=\mathbb{E} e^{-tX}$ be the Laplace transform of r.v. $X>0$. Since $\int_0^\infty e^{-tx}dt=\frac1x$ for $x>0$, $\mathbb{E}\frac1X=\int_0^\infty \phi_X(t)dt$ by Tonelli's theorem. Therefore, $$ \mathbb{E}\frac1{X_1+\dots+X_n}=\int_0^\infty \phi_{X_1+\dots+X_n}(t)dt =\int_0^\infty \left[\phi_{\xi}(t)\right]^n dt. $$
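The identity $\mathbb{E}\frac1X=\int_0^\infty \phi_X(t)\,dt$ is easy to sanity-check numerically. Here is a small check (assuming SciPy) with a concrete example not used in the answer itself, $X\sim\Gamma(3,1)$: there $\phi_X(t)=(1+t)^{-3}$ and $\mathbb{E}\frac1X=\frac{\Gamma(2)}{\Gamma(3)}=\frac12$, so both sides of the identity should evaluate to $\frac12$.

```python
import numpy as np
from scipy import integrate
from scipy.stats import gamma

# Left side: E[1/X] for X ~ Gamma(3, 1), i.e. the integral of pdf(x)/x.
lhs, _ = integrate.quad(lambda x: gamma.pdf(x, a=3) / x, 0, np.inf)

# Right side: integral of the Laplace transform phi_X(t) = (1+t)^{-3}.
rhs, _ = integrate.quad(lambda t: (1 + t) ** -3, 0, np.inf)

assert abs(lhs - 0.5) < 1e-8
assert abs(rhs - 0.5) < 1e-8
```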

Now let $Z_k\sim \Gamma(k,1)$, i.e. $Z_k$ has the density $$ f_{Z_k}(x)=\frac{x^{k-1}e^{-x}}{\Gamma(k)} $$ and the Laplace transform $$ \phi_{Z_k}(t)=\frac1{(1+t)^k}. $$ Now assume that $k$ itself is random, say $k\sim \mathrm{Exp}(1)$, and let $\xi$ have the distribution of $Z_k$ (a mixture over $k$). Then $$ \phi_{\xi}(t)=\int_0^\infty \frac1{(1+t)^k} e^{-k} dk=\frac1{1+\ln(1+t)}\sim \frac1{\ln t} $$ for large $t$, and since $\int^\infty \frac{dt}{(\ln t)^n}=\infty$ for every fixed $n$, $$ \int_0^\infty [\phi_{\xi}(t)]^n dt=\infty $$ for all $n\ge 1$.
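The closed form for the mixture transform can be verified numerically (a sketch assuming SciPy): integrate $(1+t)^{-k}e^{-k}$ over $k$ and compare with $\frac1{1+\ln(1+t)}$.

```python
import math
from scipy import integrate

def phi_xi(t):
    """phi_xi(t) = integral over k of (1+t)^{-k} e^{-k}, computed numerically."""
    val, _ = integrate.quad(
        lambda k: (1 + t) ** (-k) * math.exp(-k), 0, float("inf")
    )
    return val

# Compare with the claimed closed form 1 / (1 + ln(1+t)).
for t in (0.5, 3.0, 100.0):
    assert abs(phi_xi(t) - 1 / (1 + math.log(1 + t))) < 1e-8
```

Since $\phi_\xi(t)$ decays only logarithmically, no power $[\phi_\xi(t)]^n$ is integrable at infinity, which is exactly what makes $\mathbb{E}\frac1{X_1+\dots+X_n}$ infinite for every $n$.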

**Answer**

I think the answer is "yes". Suppose that $Y_k$, $k=1,2,\dots$, has the Gamma density with parameters $(1/k,1)$, i.e. for $x>0$ $$ f_{Y_k}(x)=\frac{x^{1/k-1}e^{-x}}{\Gamma(1/k)}. $$ If you sum $n$ i.i.d. copies of $Y_k$, the sum $S_{n,k}$ has the density $$ f_{S_{n,k}}(x)=\frac{x^{n/k-1}e^{-x}}{\Gamma(n/k)}, $$ and for $n\le k$ we have $x^{n/k-2}\ge x^{-1}$ on $(0,1]$, so $$\mathbb{E} \left[\frac 1{S_{n,k}}\right]\ge \int_0^1 \frac{e^{-x}}{x \Gamma(n/k)} dx=\infty.$$
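The divergence for $n\le k$ comes entirely from the behavior near $0$: the lower bound is $\int_0^1 \frac{e^{-x}}{x}\,dx$ up to the constant $\Gamma(n/k)$, and truncated versions of that integral grow like $\ln(1/\varepsilon)$. A small numerical illustration (assuming SciPy; the $\varepsilon$ values are arbitrary):

```python
import math
from scipy import integrate

# Truncated integrals of e^{-x}/x over [eps, 1] grow without bound as eps -> 0,
# at the rate ln(1/eps); this is the source of E[1/S_{n,k}] = infinity for n <= k.
vals = []
for eps in (1e-2, 1e-4, 1e-8):
    v, _ = integrate.quad(lambda x: math.exp(-x) / x, eps, 1)
    vals.append(v)

assert vals[0] < vals[1] < vals[2]       # strictly increasing as eps shrinks
assert vals[2] > 0.5 * math.log(1e8)     # same order of magnitude as ln(1/eps)
```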

Of course, for any fixed $k$ this fails once $n>k$. So we construct the distribution of $\xi$ as follows. Let $\xi$ have the distribution of $Y_k$, $k=1,2,\dots,$ with probability $2^{-k}$, so $$ f_{\xi}(x)=\sum_{k=1}^\infty \frac{2^{-k} x^{1/k-1}e^{-x}}{\Gamma(1/k)},\quad x>0. $$

Now, for each fixed $n$, define the event $A$ that all $X_i$, $i=1,2,\dots,n$, are drawn from the $Y_{n}$ component of the mixture. Then $\mathbb{P}(A)=\left[2^{-n}\right]^n=2^{-n^2}>0$, and conditionally on $A$ the sum has the distribution of $S_{n,n}$, so $$ \mathbb{E} \frac1{X_1+\dots+X_n}\ge \mathbb{E} \left[\frac1{X_1+\dots+X_n}\mid A\right]\mathbb{P}(A)= \mathbb{E} \left[\frac1{S_{n,n}}\right]\mathbb{P}(A)=\infty. $$
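For concreteness, the mixture $\xi$ is easy to sample from: the component index $K$ with $\mathbb{P}(K=k)=2^{-k}$ is a Geometric$(1/2)$ variable, after which one draws $\Gamma(1/K,1)$. A sketch assuming NumPy (the function name is mine):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_xi(size):
    """Draw from the mixture xi: pick K with P(K=k) = 2^{-k}, then Gamma(1/K, 1)."""
    k = rng.geometric(0.5, size=size)  # P(K = k) = 2^{-k} for k = 1, 2, ...
    return rng.gamma(shape=1.0 / k, scale=1.0, size=size)

x = sample_xi(100_000)
assert (x > 0).all()  # xi is a positive random variable

# Empirically, the chance that a single draw uses the Y_3 component is near 2^{-3},
# consistent with P(A) = (2^{-n})^n = 2^{-n^2} for n i.i.d. draws.
k = rng.geometric(0.5, size=100_000)
assert abs((k == 3).mean() - 2 ** -3) < 0.01
```

Monte Carlo averages of $1/(X_1+\cdots+X_n)$ under this $\xi$ will not stabilize as the sample size grows, as expected for an infinite expectation, though a simulation can only suggest, not prove, the divergence.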