Tail of a sequence of RV


Consider the sequence of random variables $\{X_n\}_{n=1,2,\dotsc}$ where $X_n$ is gamma-distributed with shape $n$ and scale $1/n$ (or equivalently, $2nX_n$ is $\chi^2$-distributed with $2n$ degrees of freedom). That is, $X_n$ has the p.d.f. $$ p_n(x) = \begin{cases} \frac{n^n}{(n-1)!} x^{n-1} e^{-nx} & \text{for $x \geq 0$} \\ 0 & \text{for $x < 0$} \end{cases} $$ We know that the mean and variance are respectively $$ \mathsf{E}[X_n] = 1 \qquad \text{and} \qquad \mathsf{var}(X_n) = \frac{1}{n}, $$ so the sequence tends to $1$ in probability. I want to prove that for any $\epsilon>0$, when I split the expectation integral as $$\begin{align} 1 &= \mathsf{E}[X_n] \\ &= \mathsf{E}\left[X_n\middle|X_n \leq 1+\epsilon\right] \mathsf{Pr}\left\{X_n \leq 1+\epsilon\right\} + \mathsf{E}\left[X_n\middle|X_n > 1+\epsilon\right] \mathsf{Pr}\left\{X_n > 1+\epsilon\right\} \\ &= \int_0^{1+\epsilon} x p_n(x) \mathrm{d}x + \int_{1+\epsilon}^\infty x p_n(x) \mathrm{d}x, \end{align}$$ the first integral tends to $1$ while the second tends to $0$ as $n \to \infty$. The intuition is that there are very few "outliers" as $n \to \infty$, so the conditional expectation $\mathsf{E}\left[X_n\middle|X_n > 1+\epsilon\right]$ either remains bounded or, at worst, grows more slowly than $\mathsf{Pr}\left\{X_n > 1+\epsilon\right\}$ decays to zero. Note that by Chebyshev's inequality, $$ \mathsf{Pr}\left\{\left|X_n-1\right| > \epsilon \right\} \leq \frac{1}{n\epsilon^2}. $$ If this is true, does it also hold in more generality? Note that $X_n$ can be represented as the mean of $n$ independent unit-mean exponentially distributed variables. Can we thus formulate a more general statement, not limited to exponentially distributed variables?
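For integer shape $n$ the tail integral is actually computable in closed form: $\int_{1+\epsilon}^\infty x p_n(x)\,\mathrm{d}x = \Gamma(n+1, n(1+\epsilon))/n!$, and the standard finite-sum identity $\Gamma(n+1,y) = n!\,e^{-y}\sum_{k=0}^n y^k/k!$ turns this into the Poisson probability $\mathsf{Pr}\{\mathrm{Poisson}(n(1+\epsilon)) \leq n\}$. A minimal sketch (Python standard library only; the function name and the choice $\epsilon = 0.5$ are mine) that evaluates this and lets you watch the tail shrink:

```python
import math

def tail_integral(n, eps):
    """Exact value of the tail integral int_{1+eps}^inf x p_n(x) dx
    = Gamma(n+1, n(1+eps)) / n!  for integer shape n, using the
    finite-sum identity Gamma(n+1, y) = n! e^{-y} sum_{k=0}^n y^k/k!.
    Equivalently, this is P(Poisson(n(1+eps)) <= n)."""
    y = n * (1.0 + eps)
    term = math.exp(-y)        # k = 0 term: e^{-y} y^0 / 0!
    total = term
    for k in range(1, n + 1):  # accumulate e^{-y} y^k / k! iteratively
        term *= y / k          # avoids computing y^k and k! separately
        total += term
    return total
    # Note: starting from exp(-y) underflows for y >~ 745 (i.e. very
    # large n); a log-space version would be needed beyond that.

for n in (1, 10, 100, 400):
    print(n, tail_integral(n, 0.5))
```

The printed values decay rapidly to zero, consistent with the claim that the second integral vanishes as $n \to \infty$.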

Note: this is a probabilistic rephrasing of the question Prove that $\lim\limits_{x\to\infty} \frac{\Gamma(x+1,x(1+\epsilon))}{x\Gamma(x)}=0$. which I posted a few days ago, and which has remained unanswered so far, perhaps because it was stripped of its physical meaning.



On BEST ANSWER

By the Cauchy–Schwarz inequality, $$\int_{1+\epsilon}^\infty xp_n(x) \mathrm{d}x=E(X_n;X_n\gt1+\epsilon)\leqslant\sqrt{E(X_n^2)\cdot P(X_n\gt1+\epsilon)}. $$ By the definition of the variance, $$ E(X_n^2)=E(X_n)^2+\mathrm{var}(X_n)=1+\mathrm{var}(X_n). $$ By the Bienaymé–Chebyshev inequality, $$ P(X_n\gt1+\epsilon)\leqslant P(|X_n-E(X_n)|\gt\epsilon)\leqslant\frac{\mathrm{var}(X_n)}{\epsilon^2}. $$ Thus, $$ \int_{1+\epsilon}^\infty xp_n(x) \mathrm{d}x\leqslant u_\epsilon(\mathrm{var}(X_n)),\qquad u_\epsilon(t)=\frac1\epsilon\sqrt{t(1+t)}. $$ For every $\epsilon\gt0$, $u_\epsilon(t)\to0$ as $t\to0$; since $\mathrm{var}(X_n)=1/n\to0$, the result follows.
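This chain of bounds is easy to check numerically: with $\mathrm{var}(X_n)=1/n$, the bound is $u_\epsilon(1/n)=\frac1\epsilon\sqrt{\frac1n\left(1+\frac1n\right)}$, and a Monte Carlo estimate of $E(X_n;X_n\gt1+\epsilon)$ should sit below it. A minimal sketch (Python standard library only; the sample size, seed, and $\epsilon = 0.5$ are arbitrary choices of mine, not part of the answer):

```python
import math
import random

def cs_chebyshev_bound(n, eps):
    """u_eps(t) = (1/eps) * sqrt(t(1+t)) evaluated at t = var(X_n) = 1/n."""
    t = 1.0 / n
    return math.sqrt(t * (1.0 + t)) / eps

def mc_tail_expectation(n, eps, samples=100_000, seed=0):
    """Monte Carlo estimate of E(X_n; X_n > 1+eps), where
    X_n ~ Gamma(shape=n, scale=1/n), so E[X_n] = 1 and var(X_n) = 1/n."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(samples):
        x = rng.gammavariate(n, 1.0 / n)  # shape n, scale 1/n
        if x > 1.0 + eps:
            acc += x                       # contributes x * 1{x > 1+eps}
    return acc / samples

for n in (5, 20, 80):
    print(n, mc_tail_expectation(n, 0.5), cs_chebyshev_bound(n, 0.5))
```

The estimate stays well below the bound, and both decrease with $n$, as the $u_\epsilon(\mathrm{var}(X_n))\to0$ argument predicts. The Cauchy–Schwarz step is loose here, so the gap between the two columns is large; that looseness is harmless since only the limit matters.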