Moments of minimum of random variables


Let $\mu$ be a non-atomic probability measure on $[0,\infty)$ and sample $X_1,X_2$ from $\mu$ independently. Does $\min(X_1,X_2)$ have twice as many moments as $X_1$? Is the quantity $$ \frac{\mathbb E\min(X_1,X_2)}{\left(\mathbb E \sqrt{X_1}\right)^2} $$ bounded away from $0$ and $\infty$?

More generally, does $\min(X_1,\ldots,X_n)$ have $n$ times as many moments as $X_1$? Moreover, is $$ \frac{\mathbb E\min(X_1,\ldots,X_n)}{\left(\mathbb E \sqrt[n]{X_1}\right)^n} $$ bounded away from $0$ and $\infty$?

For nice distributions, the identity $\mathbb E X=\int_0^\infty \mathbb P(X>x)\; dx$ allows us to reformulate the general version as follows: $$ \|\mathbb P(X_1>t)\|_n\approx\|\mathbb P(X_1>t^n)\|_1, $$ where the norms are taken in $t$ over $(0,\infty)$ and $\approx$ means equal up to constant factors.
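Explicitly, writing $G(t)=\mathbb P(X_1>t)$ and using independence, $\mathbb P(\min(X_1,\ldots,X_n)>t)=G(t)^n$, so $$ \mathbb E\min(X_1,\ldots,X_n)=\int_0^\infty G(t)^n\; dt=\|G\|_n^n, \qquad \left(\mathbb E\sqrt[n]{X_1}\right)^n=\left(\int_0^\infty \mathbb P(X_1>t^n)\; dt\right)^n=\|G(t^n)\|_1^n, $$ and boundedness of the ratio away from $0$ and $\infty$ is exactly the comparability of these two norms.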

There are 3 answers below.

BEST ANSWER

Yes: $m_n=\min(X_1,\ldots,X_n)$ has at least $n$ times as many moments as $X$. For $X\ge0$ and increasing $g$ with $g(0)=0$ we have $E(g(X))=\int_0^\infty g'(t)P(X>t)\,dt$, hence $$E(X^{k})=\int_0^\infty k t^{k-1} P(X>t)\,dt\tag 1$$ while, since independence gives $P(m_n>t)=P^n(X>t)$, $$E(m_n^{nk})=\int_0^\infty nk t^{nk-1}P^n(X>t)\,dt\tag 2$$

If $(1)$ converges, then $h(t)=t^kP(X>t)$ is bounded (by Markov's inequality) and tends to $0$ as $t\to\infty$; since the integrand of $(2)$ is $n\,h(t)^{n-1}$ times the integrand of $(1)$, it is dominated by a constant multiple of the latter, so $(2)$ converges as well.
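For a concrete illustration that the factor $n$ is the right one, suppose the tail is Pareto: $P(X>t)=t^{-\alpha}$ for $t\ge1$. Then $E(X^k)<\infty$ exactly for $k<\alpha$, while $P(m_n>t)=t^{-n\alpha}$ gives $E(m_n^k)<\infty$ exactly for $k<n\alpha$, so here the minimum has precisely $n$ times as many moments.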

The two-sided bound you ask for fails, though: as $n\to\infty$, $E(m_n)\to \min \operatorname{supp}(\mu)$ while $(EX^{1/n})^n\to\exp(E\log X)$. The ratio is therefore bounded away from $\infty$ (indeed it is at most $1$, by AM–GM) but not away from $0$. The most interesting case is $E\log X=-\infty$ (which forces $0\in \operatorname{supp}(\mu)$).
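One way to see the second limit (assuming $EX^{1/n}<\infty$ for some $n$): by Jensen's inequality $n\log EX^{1/n}\ge E\log X$, while $$ n\log EX^{1/n}\le n\left(EX^{1/n}-1\right)=E\,\frac{X^{1/n}-1}{1/n}\ \downarrow\ E\log X\qquad(n\to\infty), $$ because $\frac{x^p-1}{p}$ decreases to $\log x$ as $p\downarrow0$; hence $(EX^{1/n})^n\to\exp(E\log X)$, with the convention $\exp(-\infty)=0$.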

ANSWER

If the $X_k$ are exponential with parameter $\lambda>0$, then $$ {\Bbb E[\min(X_1,X_2)]\over\Bbb E\sqrt{X_1}}={1/(2\lambda)\over\sqrt{\pi}/(2\sqrt{\lambda})}={1\over\sqrt{\pi\lambda}}, $$ which depends on $\lambda$, so this ratio cannot be bounded away from both $0$ and $\infty$. With thoughts of scaling in my head, let me suggest that perhaps you want to consider $$ {\Bbb E[\min(X_1,X_2)]\over\left[\Bbb E\sqrt{X_1}\right]^2} $$ instead.
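Indeed, squaring the denominator makes the $\lambda$-dependence cancel: here $\Bbb E[\min(X_1,X_2)]=\frac{1}{2\lambda}$ and $\left[\Bbb E\sqrt{X_1}\right]^2=\frac{\pi}{4\lambda}$, so $$ {\Bbb E[\min(X_1,X_2)]\over\left[\Bbb E\sqrt{X_1}\right]^2}={1/(2\lambda)\over\pi/(4\lambda)}={2\over\pi} $$ for every $\lambda>0$, as one expects from scale invariance.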

ANSWER

Let $X>0$ be a random variable and consider an i.i.d. sequence $X_1,\ldots,X_n$ sampled from $X$.

The following inequality holds and the constants are optimal as $n\to\infty$: $$ 0\leq \frac{\mathbb E\min(X_1,\ldots,X_n)}{\left(\mathbb E \sqrt[n]{X}\right)^n}\leq 1 $$

Proof: Note that $\min(X_1,\ldots,X_n)\leq \sqrt[n]{X_1\cdots X_n}$. Taking expectations and using independence, $$ \mathbb E\min(X_1,\ldots,X_n)\leq \mathbb E\sqrt[n]{X_1\cdots X_n}=\prod_{i=1}^n\mathbb E\sqrt[n]{X_i}=\left(\mathbb E\sqrt[n]{X}\right)^n, $$ which is the upper bound. The upper bound cannot be improved, as one sees by letting $X$ tend to a deterministic random variable.

The lower bound cannot be improved: consider the case $X\sim U[0,1/m]$. Note that $$ \mathbb E\sqrt[n]{X}=m\int_0^{1/m}x^{1/n}\; dx=\frac{m^{-1/n}}{1+\frac{1}{n}} \geq (me)^{-1/n}, $$ where we have used the elementary inequality $e^{1/n}\geq 1+\frac1n$.

Moreover, $\mathbb E\min(X_1,\ldots,X_n)=\frac{1}{m(n+1)}$. Therefore $$ \frac{\mathbb E\min(X_1,\ldots,X_n)}{\left(\mathbb E \sqrt[n]{X}\right)^n}\leq \frac{1/(m(n+1))}{(me)^{-1}}=\frac{e}{n+1}\to 0. $$ Thus, as $n\to\infty$, the ratio for $U[0,1/m]$ variables becomes arbitrarily small, so the lower bound $0$ cannot be replaced by a positive constant independent of $n$.
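In fact the estimate via $e$ can be avoided: for $X\sim U[0,1/m]$ the ratio is exactly $$ \frac{\mathbb E\min(X_1,\ldots,X_n)}{\left(\mathbb E \sqrt[n]{X}\right)^n}=\frac{1/(m(n+1))}{(1/m)\left(1+\frac1n\right)^{-n}}=\frac{\left(1+\frac1n\right)^n}{n+1}, $$ independent of $m$ (as it must be, since the ratio is scale invariant), and this tends to $0$ like $e/(n+1)$.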

Note: For fixed $n$, the best lower bound is unknown. For $n=2$ I can get the ratio down to $1/2$ using a density with a pole at $0$ blowing up like $x^{\epsilon-1}$. Can anyone do better?
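Concretely (this is presumably the intended construction), take the density $f_\epsilon(x)=\epsilon x^{\epsilon-1}$ on $(0,1)$, so that $\mathbb P(X>t)=1-t^\epsilon$ for $t\in[0,1]$. Then $$ \mathbb E\min(X_1,X_2)=\int_0^1(1-t^\epsilon)^2\; dt=1-\frac{2}{1+\epsilon}+\frac{1}{1+2\epsilon}=2\epsilon^2+O(\epsilon^3), \qquad \mathbb E\sqrt{X}=\frac{\epsilon}{\epsilon+\frac12}=2\epsilon+O(\epsilon^2), $$ so the ratio tends to $\frac{2\epsilon^2}{4\epsilon^2}=\frac12$ as $\epsilon\to0$.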