Convergence in $L_p$ norm of minimum of uniform random variables, $Y_1=\min\left\{X_1,\ldots,X_n\right\}$ with $X_i\sim U(0,1)$


$X_1,\ldots,X_n$ are i.i.d. with $X_i\sim U\left(0,1\right)$. Prove that $Y_1=\min\left\{X_1,X_2,\ldots,X_n\right\}$ converges to $Y=0$ in expectation of order $p$, for every $p\ge1$.

My try and where I got stuck:
$$\lim _{n\to \infty }E\left[\left|Y_1-Y\right|^p\right]=\lim _{n\to \infty }E\left[\left|Y_1\right|^p\right]=\lim _{n\to \infty }E\left[\left(\min\left\{X_1,X_2,\ldots,X_n\right\}\right)^p\right]\\ \le \lim _{n\to \infty }E\left[\left|X_i\right|^p\right]=\lim _{n\to \infty }\int _0^1 x^p\,dx=\frac{1}{p+1}\ne 0$$
for any fixed $i\in \left\{1,2,\ldots,n\right\}$.
That is my problem: this bound does not go to $0$, and I don't see how to get $0$ here.
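As a quick numerical sanity check (not a proof), one can estimate $E\left[Y_1^p\right]$ by Monte Carlo and watch it shrink as $n$ grows, which suggests the crude bound by a single $X_i$ is simply too weak. A minimal sketch in plain Python; the function name, sample size, and the choice $p=2$ are mine:

```python
import random

def mc_min_moment(n, p, trials=100_000, seed=0):
    """Monte Carlo estimate of E[(min{X_1,...,X_n})^p] for X_i ~ U(0,1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += min(rng.random() for _ in range(n)) ** p
    return total / trials

# The estimates shrink rapidly with n, hinting that E[Y_1^p] -> 0.
for n in (1, 10, 100):
    print(n, mc_min_moment(n, p=2))
```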

There are 3 answers below.

Best answer:

Your first inequality is too weak to give the result. You need a sharper estimate of $E\left[\left(\min\left\{X_1,X_2,\ldots,X_n\right\}\right)^p\right]$.

Second answer:

If $X_i\sim U\left(0,1\right), i=1, \dots, n$ and are independent, we have the following property:

$$Y_1=\min\left\{X_1,X_2,\ldots,X_n\right\} \sim \text{Beta}(a=1,b=n).$$

Hence, as the expectation of any power of a beta distribution is known (see here), we have

$$ \mathbb E\left[\left|Y_1-0\right|^p\right]=\mathbb E\left[Y_1^p\right]=\\ \frac{\Gamma(a+b)\,\Gamma(a+p)}{\Gamma(a)\,\Gamma(a+p+b)}= \frac{\Gamma(1+n)\,\Gamma(1+p)}{\Gamma(1)\,\Gamma(1+n+p)}= \\ \frac{n!\,\Gamma(1+p)}{(n+p)((n-1)+p)\cdots(1+p)\,\Gamma(1+p)}=\color{blue}{\frac{n!}{(n+p)((n-1)+p)\cdots(1+p)}}. \tag {1} $$

For $p=1$, it becomes $\frac{1}{n+1}$, which tends to $0$ as $n \to \infty$.

For $p>1$, write the last term in (1) as a product. Since $k+p\ge k+1$ for every $k$ when $p\ge 1$,

$$\frac{n!}{(n+p)((n-1)+p)\cdots(1+p)}=\prod_{k=1}^{n}\frac{k}{k+p}\le\prod_{k=1}^{n}\frac{k}{k+1}=\frac{1}{n+1},$$

so the last term in (1) tends to zero as $n \to \infty$. This yields the desired result:

$$\lim_{n \to \infty } \mathbb E\left[\left|Y_1-0\right|^p\right]=0$$

for any $p \ge 1$.
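The closed form in (1) is easy to check numerically. The sketch below (function names are mine) compares the gamma-function expression with the explicit product and confirms the decay to $0$:

```python
from math import gamma, factorial

def moment_gamma(n, p):
    """E[Y_1^p] via the beta-moment formula Gamma(1+n)Gamma(1+p)/(Gamma(1)Gamma(1+n+p))."""
    return gamma(1 + n) * gamma(1 + p) / (gamma(1) * gamma(1 + n + p))

def moment_product(n, p):
    """The same quantity written as n! / ((1+p)(2+p)...(n+p))."""
    denom = 1.0
    for k in range(1, n + 1):
        denom *= k + p
    return factorial(n) / denom

# The two forms agree, and the moment tends to 0 as n grows.
for n in (1, 10, 100):
    print(n, moment_gamma(n, 2.5), moment_product(n, 2.5))
```

For $p=1$ this reproduces $1/(n+1)$ exactly, matching the remark above.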

Third answer:

Here's a short way to do this, provided you know the Dominated Convergence Theorem or, more generally, uniform integrability.

First, show convergence in probability. For $0<\epsilon<1$,

$$P(|Y_{n}|\geq\epsilon)=P(X_{1}\geq\epsilon,\ldots,X_{n}\geq\epsilon)\stackrel{\text{iid}}{=}P(X_{1}\geq\epsilon)^{n}=(1-\epsilon)^{n},$$

which goes to $0$ as $n\to\infty$ (here $Y_{n}$ denotes the minimum order statistic).
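The identity $P(Y_{n}\geq\epsilon)=(1-\epsilon)^{n}$ is easy to verify by simulation; a minimal sketch, with sample size, seed, and the choices $n=20$, $\epsilon=0.1$ all mine:

```python
import random

def tail_prob(n, eps, trials=200_000, seed=1):
    """Empirical P(min{X_1,...,X_n} >= eps) for X_i ~ U(0,1)."""
    rng = random.Random(seed)
    hits = sum(
        min(rng.random() for _ in range(n)) >= eps
        for _ in range(trials)
    )
    return hits / trials

n, eps = 20, 0.1
print(tail_prob(n, eps), (1 - eps) ** n)  # empirical vs. exact (1-eps)^n
```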

And $|Y_{n}|\leq 1$ for all $n$. Hence, using the Dominated Convergence Theorem, you have that $Y_{n}\xrightarrow{L^{p}}0$.

The uniform integrability theorem says that if $X_{n}\xrightarrow{P} X$ and $(X_{n})$ is uniformly integrable, then $X_{n}\xrightarrow{L^{1}}X$, and vice versa. You can see my short proof here. You can repeat that proof and conclude convergence in expectation of order $p$.

In fact, you can generalize this even further. Let $X_{1}$ be non-negative with a finite $p$-th moment, and suppose that $P(X_{1}>\epsilon)<1$ for each $\epsilon>0$. Then $Y_{n}\xrightarrow{P}0$ by the same reasoning. Also note that $|Y_{n}|\leq |X_{1}|$, so $|Y_{n}|^{p}$ is dominated by the integrable random variable $|X_{1}|^{p}$ and hence uniformly integrable; you can then conclude using DCT or uniform integrability.
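To illustrate the generalization with a concrete example (my choice, not from the answer): if $X_i\sim\text{Exp}(1)$, the minimum of $n$ of them is $\text{Exp}(n)$, so $E[Y_n]=1/n\to 0$. A quick simulation sketch:

```python
import random

def mc_min_exp_mean(n, trials=100_000, seed=2):
    """Monte Carlo estimate of E[min{X_1,...,X_n}] for X_i ~ Exp(1).

    Theory: the minimum of n independent Exp(1) variables is Exp(n),
    so the estimate should be close to 1/n.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += min(rng.expovariate(1.0) for _ in range(n))
    return total / trials

for n in (1, 10, 50):
    print(n, mc_min_exp_mean(n))  # roughly 1/n
```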

Here's another way to do it without explicitly computing integrals.

For a positive random variable $X$, you have $E(X^{p})=\int_{0}^{\infty}pt^{p-1}P(X>t)\,dt$. Now note that $P(Y_{n}>t)=(1-t)^{n}$ for $0\leq t\leq 1$, and $P(Y_{n}>t)=0$ for $t>1$.

Hence it suffices to show that $\int_{0}^{1}t^{p-1}(1-t)^{n}\,dt\to 0$ as $n\to\infty$, but this is easy by the Dominated Convergence Theorem: for $p\ge 1$ the integrand is dominated by $1$ and converges pointwise to $0$ on $(0,1)$.
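One can also check this integral's decay numerically. It equals the Beta function $B(p,n+1)=\Gamma(p)\Gamma(n+1)/\Gamma(p+n+1)$ (a standard identity, not used in the proof above); the sketch below cross-checks that closed form against a midpoint-rule approximation, with $p=2$ as an arbitrary choice:

```python
from math import gamma

def beta_integral(p, n):
    """Exact value of int_0^1 t^(p-1) (1-t)^n dt = B(p, n+1), via gamma functions."""
    return gamma(p) * gamma(n + 1) / gamma(p + n + 1)

def riemann_integral(p, n, steps=100_000):
    """Midpoint-rule approximation of the same integral, as a cross-check."""
    h = 1.0 / steps
    return sum(
        ((i + 0.5) * h) ** (p - 1) * (1 - (i + 0.5) * h) ** n
        for i in range(steps)
    ) * h

p = 2.0
# The integral shrinks toward 0 as n grows, as the DCT argument predicts.
for n in (1, 10, 100):
    print(n, beta_integral(p, n), riemann_integral(p, n))
```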