Variance of sample minimum


$X_1,X_2,\ldots$ is an iid sequence of random variables with the property that $P\{X_1 \geq 0\} = 1$.

I want to show that $\mathrm{Var}\left(\min(X_1,X_2,\ldots,X_n)\right)$ decreases with increasing $n$.

Intuition: Since $\min(X_1,X_2,\ldots,X_n)$ is non-increasing in $n$ and bounded below by $0$, it should converge almost surely to the essential infimum of $X_1$, a constant. So the variance should tend to zero.
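This intuition is easy to sanity-check by simulation. The sketch below is a minimal Monte Carlo check assuming $X_i \sim \mathrm{Exp}(1)$ purely for illustration (for that choice $\min(X_1,\ldots,X_n) \sim \mathrm{Exp}(n)$, so the exact variance is $1/n^2$):

```python
import random
import statistics

def var_of_min(n, trials=200_000, seed=0):
    """Monte Carlo estimate of Var(min(X_1, ..., X_n)) for X_i ~ Exp(1).

    Exp(1) is an illustrative assumption; the min of n iid Exp(1)
    variables is Exp(n), so the exact variance here is 1/n**2.
    """
    rng = random.Random(seed)
    mins = [min(rng.expovariate(1.0) for _ in range(n))
            for _ in range(trials)]
    return statistics.pvariance(mins)

# Estimates shrink roughly like 1/n^2 as n grows.
estimates = {n: var_of_min(n) for n in (1, 2, 4, 8)}
```

For these parameters the estimates land close to $1, 1/4, 1/16, 1/64$, consistent with the variance decreasing in $n$ for this particular $F$.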

With $y \mapsto F(y)$ denoting the distribution function of $X_1$, so that $P\{\min(X_1,\ldots,X_n) > y\} = (1-F(y))^n$, I can write

$$\mathrm{Var}\left(\min(X_1,X_2,\ldots,X_n)\right) = \int_0^\infty 2y(1-F(y))^n\,dy - \left(\int_0^\infty (1-F(y))^n\,dy\right)^2$$
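Since $P\{\min(X_1,\ldots,X_n) > y\} = (1-F(y))^n$, both integrals can be evaluated numerically once a concrete $F$ is fixed. A sketch with $F(y) = 1 - e^{-y}$ (the Exp(1) distribution, an illustrative assumption; there the closed form is $1/n^2$):

```python
import math

def var_min_exp(n, upper=40.0, steps=200_000):
    """Var(min) = int 2y S(y)^n dy - (int S(y)^n dy)^2 via the midpoint
    rule, where S(y) = 1 - F(y) = exp(-y) is the Exp(1) survival
    function, so that S(y)^n = P(min(X_1, ..., X_n) > y)."""
    h = upper / steps
    second_moment = 0.0  # int 2y S(y)^n dy = E[min^2]
    first_moment = 0.0   # int S(y)^n dy    = E[min]
    for k in range(steps):
        y = (k + 0.5) * h
        s = math.exp(-n * y)  # S(y)^n
        second_moment += 2.0 * y * s * h
        first_moment += s * h
    return second_moment - first_moment ** 2
```

Truncating the integrals at `upper = 40` is harmless here since $e^{-40}$ is negligible; the computed values agree with $1/n^2$ and decrease in $n$.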

I replaced $n$ by a real variable $x$ and considered the derivative of the resulting expression with respect to $x$, but that did not help. Is there a way to show what I want to show without making assumptions on $F$?



Denote $Y \equiv \min(X_1,X_2,\ldots,X_n)$; then, by independence, \begin{align} P( Y > y) &= \prod_{i=1}^n P( X_i > y) = \bigl(P(X_1 > y)\bigr)^n. \end{align} For any $y$ at which the tail probability is strictly less than one, $P( X_1 > y) < 1$, the limit is zero: $$\lim_{n \to \infty} P( Y > y) = \lim_{n \to \infty} \bigl(P( X_1 > y)\bigr)^n = 0.$$ That is, the distribution function converges towards the Heaviside step function

$$\lim_{n \to \infty} F_Y(y) = \lim_{n \to \infty} \bigl( 1-P( Y > y) \bigr) = H(y - y_0) \qquad (y \neq y_0),$$ where $y_0 \equiv \sup\{y: P(X_1 > y) = 1 \}$ is where the "step" takes place.

We know that $y_0 \geq 0$: since it is given that $P(X_1 \geq 0) = 1$, we have $P(X_1 > y) = 1$ for every $y < 0$, so the supremum is at least $0$.

The limiting distribution is a point mass at $y_0$ (a Heaviside distribution function, i.e. a Dirac delta density), whose variance is zero, as we needed.
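A tiny numeric illustration of this convergence, assuming $X_i \sim \mathrm{Uniform}(0,1)$ (an illustrative choice for which $y_0 = 0$, so the step sits at the origin):

```python
def F_min(y, n):
    """CDF of min(X_1, ..., X_n) for X_i ~ Uniform(0, 1):
    F_Y(y) = 1 - (1 - F(y))^n with F(y) = y on [0, 1].
    Here y_0 = sup{y : P(X_1 > y) = 1} = 0, so as n grows the
    CDF approaches a unit step at 0: F_Y(y) -> 1 for every y > 0."""
    f = min(max(y, 0.0), 1.0)  # F(y) for Uniform(0, 1), clamped
    return 1.0 - (1.0 - f) ** n

# Even at y = 0.01 the CDF saturates once n is large:
# F_min(0.01, 10) is still small, but F_min(0.01, 1000) is nearly 1.
```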


I can prove the variance tends to zero, but don't see how to show that the variance decreases.

Writing $Y_n:=\min(X_1,\ldots,X_n)$, use the method of @Lee David Chung Lin to argue that the $Y_n$ converge in distribution to the essential infimum of $X_1$, which we can call $m$. Since the limit is a constant, $Y_n$ also converges in probability to $m$.

Convergence of $Y_n$ in probability to a constant $m\ge0$ is not enough to conclude that $\operatorname{Var}(Y_n)\to0$. The key observation is that the $\{Y_n\}$ are a decreasing sequence: $Y_{n+1}\le Y_n$ for all $n$. This means that we can conclude almost sure convergence to $m$. Then deduce, by monotone convergence, that $E(Y_n)\to m$ and $E(Y_n^2)\to m^2$, and therefore $\operatorname{Var}(Y_n)\to0$.