Uniform integrability of the inverse of sample moments


I am interested in the uniform integrability of the set $\{Y_n\}_{n\ge 3}$ where

$$ Y_n = \bigg(\frac{1}{n} \sum_{i=1}^n X_i^k\bigg)^{-1}, $$ the $X_i$'s are i.i.d. observations of a continuous random variable $X$, $k\ge 1$, and $E[X] < \infty$.

Is $\{Y_n\}_{n\ge 3}$ uniformly integrable for $k = 1$, which corresponds to the inverse of the sample mean? What about $k > 1$? In particular, I am most interested in the inverse of the second sample moment, i.e., $k=2$.

If it is not uniformly integrable in general, is it uniformly integrable if we restrict the allowable $X$?

Best answer:
  1. This cannot be true in general: consider $k=1$ (or any odd $k$) and take the $X_i$'s to be i.i.d. standard normal (mean $0$, variance $1$). We show below that $Y_n$ is not UI.

Note that $$ \mathbb{E}\left(|Y_n| 1\{|Y_n|\geq M\}\right) \geq M\,\mathbb{P}\left(|Y_n|\geq M\right) = M\,\mathbb{P}\left(|\bar{X}_n|\leq \frac{1}{M}\right) \to M $$ as $n\to \infty$ by the law of large numbers ($\bar{X}_n\to 0$ in probability). Now take the limit as $M\to \infty$ to obtain

$$ \liminf_{M\to \infty }\liminf_{n\to \infty }\mathbb{E}\left(|Y_n| 1\{|Y_n|\geq M\}\right)= \infty $$
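A quick Monte Carlo sketch of the key step, assuming NumPy (the sample sizes, threshold $M$, and replication count are arbitrary illustrative choices): for standard normal $X_i$, the lower bound $M\,\mathbb{P}(|\bar{X}_n|\leq 1/M)$ approaches $M$ as $n$ grows, so the truncated expectation cannot be made small uniformly in $n$.

```python
import numpy as np

# Sketch: for i.i.d. standard normal X_i, P(|mean of X_1..X_n| <= 1/M) -> 1
# as n -> infinity, so M * P(|mean| <= 1/M) -> M. This lower-bounds the
# truncated expectation E(|Y_n| 1{|Y_n| >= M}), ruling out uniform integrability.
rng = np.random.default_rng(0)

def tail_lower_bound(n, M, reps=400):
    """Estimate M * P(|sample mean of n standard normals| <= 1/M)."""
    means = rng.standard_normal((reps, n)).mean(axis=1)
    return M * np.mean(np.abs(means) <= 1.0 / M)

M = 10.0
for n in (100, 4_000, 40_000):
    print(n, tail_lower_bound(n, M))  # increases toward M = 10 as n grows
```

For fixed $M$, the estimate saturates at $M$ once $1/M$ is many standard deviations ($1/\sqrt{n}$) wide, matching the limit in the display above.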

  2. Since you are most interested in $k=2$, let us take $k$ even. We show below that one "allowable" class of $X$ is the following: if $X$ is a random variable with $\mathbb{E}(X^{-2k})<\infty$, then $\{Y_n\}$ is UI.

(Note that the conditions in 2 are sufficient but might not be necessary.)

Consider the function $\frac{1}{x}$: it is convex on $(0,\infty)$, so Jensen's inequality gives (we use that $k$ is even to ensure the $X_i^k$'s are non-negative; the inequality below is trivial when any of the $X_i$ is zero) $$ \left(\frac{1}{n}\sum_{i=1}^n X_i^k\right)^{-1} \leq \frac{1}{n}\sum_{i=1}^n X_i^{-k} . $$ Thus $$ \mathbb{E}\left(\left|\left(\frac{1}{n}\sum_{i=1}^n X_i^k\right)^{-1}\right| 1\{\left|\left(\frac{1}{n}\sum_{i=1}^n X_i^{k}\right)^{-1}\right|\geq M\}\right)\\ \leq \frac{1}{n}\sum_{i=1}^n\mathbb{E}\left( X_i^{-k}1\{\left|\left(\frac{1}{n}\sum_{i=1}^n X_i^{k}\right)^{-1}\right|\geq M\}\right) \\ = \mathbb{E}\left( X_1^{-k}1\{\left|\left(\frac{1}{n}\sum_{i=1}^n X_i^{k}\right)^{-1}\right|\geq M\}\right) \leq \mathbb{E}^{1/2}\left( X_1^{-2k}\right)\mathbb{P}^{1/2}\left(\frac{1}{n}\sum_{i=1}^n X_i^{k}\leq \frac{1}{M}\right). $$ In the last step we used the Cauchy–Schwarz inequality; note that the $X_i^{-k}$'s are i.i.d. random variables with finite second moment. Let us now consider two cases:
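A small numerical sanity check of the Jensen step, assuming NumPy (the sample values and exponents are arbitrary illustrations): for any positive sample, the inverse of the $k$-th sample moment is bounded by the sample average of the $-k$-th powers.

```python
import numpy as np

# Sketch: verify (1/n * sum x_i^k)^(-1) <= 1/n * sum x_i^(-k) for a positive
# sample, which is Jensen's inequality applied to the convex function 1/x.
rng = np.random.default_rng(1)

def jensen_sides(x, k):
    """Return (LHS, RHS) of the Jensen bound for a positive sample x."""
    lhs = 1.0 / np.mean(x ** k)
    rhs = np.mean(x ** (-k))
    return lhs, rhs

x = rng.uniform(0.1, 5.0, size=1000)  # positive sample, bounded away from 0
for k in (2, 4):
    lhs, rhs = jensen_sides(x, k)
    assert lhs <= rhs  # inverse of the mean <= mean of the inverses
```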

Case I: $\mathbb{E}(X^{k})<\infty$. Then $\frac{1}{n}\sum_{i=1}^n X_i^{k} \to \mathbb{E}(X^k)$ in probability as $n\to\infty$, by the law of large numbers. Moreover $\mathbb{E}(X^k)>0$, as otherwise $X=0$ with probability $1$, which would violate $\mathbb{E}(X^{-2k})<\infty$. Therefore, choosing $M$ large enough that $\frac{1}{M} < \mathbb{E}(X^k)$, the probability term above tends to $0$, which completes the proof.

Case II: $\mathbb{E}(X^{k})=\infty$. Here too, for any constant $c>0$, the probability that $\frac{1}{n}\sum_{i=1}^n X_i^{k} >c$ goes to $1$ as $n \to \infty$ (apply the law of large numbers to the truncated variables $X_i^k \wedge m$, whose mean exceeds $c$ for $m$ large). Therefore again choose $M$ large enough to complete the proof.
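Case II can also be seen in simulation. The sketch below, assuming NumPy, uses the illustrative choice $X_i^k = 1/U_i$ with $U_i$ uniform on $(0,1)$ (a Pareto-type variable with infinite mean; this particular distribution is not part of the original argument): the sample average still exceeds any fixed $c$ with probability tending to $1$.

```python
import numpy as np

# Sketch: even when E(X^k) = infinity, the sample average of the X_i^k
# exceeds any fixed c > 0 with probability tending to 1 as n grows.
# Illustration: X_i^k = 1/U_i with U_i ~ Uniform(0,1), which has infinite mean.
rng = np.random.default_rng(2)

def prob_mean_exceeds(n, c, reps=200):
    """Estimate P((1/n) * sum of n i.i.d. 1/U_i draws > c) by simulation."""
    samples = 1.0 / rng.uniform(size=(reps, n))
    return np.mean(samples.mean(axis=1) > c)

c = 3.0
for n in (10, 1_000, 50_000):
    print(n, prob_mean_exceeds(n, c))  # tends toward 1 as n grows
```

Consequently the probability term $\mathbb{P}\left(\frac{1}{n}\sum_{i=1}^n X_i^{k}\leq \frac{1}{M}\right)$ vanishes for any fixed $M$, just as in Case I.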