Minimum of n iid chi-squared with k degrees of freedom


I am currently trying to compute the expectation (a closed-form distribution would be even better) of the minimum of $n$ iid chi-squared random variables with $k$ degrees of freedom.

After struggling with the pdf (easily derived from the definition), I tried some numerical experiments. Based on log-log plots, I have a strong belief ($R^2$ of the regression $> 0.998$) that the answer is: $$\mathbb{E}\left[\min_{i \in [n]}X_i\right] = kn^{-\frac{1}{\sqrt{k}}}$$ where $X_i \sim \chi^2(k)$ are iid.
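An experiment like the one described can be sketched as follows (a minimal Monte Carlo check; the function name `min_chi2_mean`, the seed, and the parameter choices are mine, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

def min_chi2_mean(n, k, reps=100_000):
    """Average of the minimum of n iid chi2(k) draws, over `reps` trials."""
    samples = rng.chisquare(k, size=(reps, n))
    return samples.min(axis=1).mean()

# Compare simulated E[min] with the conjectured k * n**(-1/sqrt(k)).
for n, k in [(10, 2), (10, 5), (50, 5)]:
    est = min_chi2_mean(n, k)
    conj = k * n ** (-1 / np.sqrt(k))
    print(f"n={n}, k={k}: simulated {est:.4f}, conjectured {conj:.4f}")
```

A log-log regression of the simulated means against $n$ for fixed $k$ would recover the kind of fit reported above.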

I haven't found a way to prove it yet and would be grateful for any help or reference (I didn't find any either).


3 Answers


Hi: The probability that the minimum is less than $x$ equals one minus the probability that all $n$ chi-squared random variables with $k$ df are greater than $x$. The probability that all $n$ of them exceed $x$ is $(1-F(x))^n$, where $1-F(x)$ is the probability that a single chi-squared rv exceeds $x$. So the CDF of the minimum is $1-(1-F(x))^n$. You can look up $F(x)$ for a chi-squared with $k$ df (any statistics text has tables in the back, or use R) and compute $1-(1-F(x))^n$. Hopefully you will get something close to what you got numerically.
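The CDF described here is easy to evaluate in code; a minimal sketch using SciPy's `chi2` (the function name `min_cdf` is my own):

```python
from scipy.stats import chi2

def min_cdf(x, n, k):
    """CDF of the minimum of n iid chi-squared(k) variables:
    P(min <= x) = 1 - (1 - F(x))**n, with F the chi2(k) CDF."""
    return 1.0 - (1.0 - chi2.cdf(x, df=k)) ** n

# Example: probability that the smallest of 10 chi2(5) draws is below 1.0
print(min_cdf(1.0, n=10, k=5))
```

Differentiating this CDF gives the pdf of the minimum, and integrating the survival function $(1-F(x))^n$ over $[0,\infty)$ gives its expectation numerically.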


$kn^{-\frac{1}{\sqrt{k}}}$ does not look correct for the expectation of the minimum.

As an easy example, with $k=2$ degrees of freedom, each $X_i$ has an $\rm{Exp}(\frac12)$ distribution (rate $\frac12$, mean $2$).

So their minimum has an $\rm{Exp}(\frac n2)$ distribution with expectation $2n^{-1}$ rather than $2n^{-\frac{1}{\sqrt{2}}}$.
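This $k=2$ case is easy to check by simulation (a quick sketch; the sample sizes and seed are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
# chi2(2) is Exp(rate 1/2); the min of n iid Exp(1/2) is Exp(rate n/2),
# so the expectation is 2/n, not 2 * n**(-1/sqrt(2)).
mins = rng.chisquare(2, size=(100_000, n)).min(axis=1)
print(mins.mean(), 2 / n, 2 * n ** (-1 / np.sqrt(2)))
```

The simulated mean lands on $2/n$, visibly far from the conjectured $2n^{-1/\sqrt{2}}$.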


You can use the Fisher-Tippett-Gnedenko theorem (in its form for minima). Near $0$ the chi-squared($k$) CDF behaves like $$F(x) \approx \frac{x^{k/2}}{2^{k/2}\,\Gamma(k/2+1)},$$ so the minimum lies in the Weibull min-domain of attraction, and

$$\mathbb E\left[\min_{i\in[n]} X_i\right]\sim_{n\to\infty} \Gamma\!\left(1+\tfrac{2}{k}\right)\left(\frac{2^{k/2}\,\Gamma(k/2+1)}{n}\right)^{2/k} = 2\,\Gamma\!\left(1+\tfrac{2}{k}\right)\Gamma\!\left(\tfrac{k}{2}+1\right)^{2/k} n^{-2/k}.$$

For $k=2$ this reduces to $2/n$, matching the exponential case in the previous answer.
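A hedged numerical check of this kind of asymptotic: assuming the small-$x$ expansion $F(x) \approx x^{k/2}/(2^{k/2}\,\Gamma(k/2+1))$, the minimum scales like $2\,\Gamma(1+2/k)\,\Gamma(k/2+1)^{2/k}\, n^{-2/k}$, which we can compare against simulation (the helper name `asymptotic` and the parameters are mine):

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(2)

def asymptotic(n, k):
    """Leading-order approximation to E[min of n iid chi2(k)],
    from the Weibull min-domain of attraction."""
    return 2 * gamma(1 + 2 / k) * gamma(k / 2 + 1) ** (2 / k) * n ** (-2 / k)

n, k = 200, 4
sim = rng.chisquare(k, size=(20_000, n)).min(axis=1).mean()
print(sim, asymptotic(n, k))  # should agree to within a few percent
```

The agreement improves as $n$ grows, since the asymptotic only uses the leading term of $F$ near zero.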