Expectation of normalized order statistics


Is there a way to calculate quantities of the form $$\mu_k = \mathbf{E}\bigg[ \frac{X_{(k)}}{\sum_{i=1}^n X_i} \bigg]$$ where the $X_i$'s are independent exponentially distributed random variables with rate $\lambda = 1$ (hence mean $1$), and $X_{(k)}$ denotes the $k$-th order statistic?


Your question is actually interesting. If you consider Sukhatme's spacings, you can get an answer. Since $X_1, \ldots, X_n$ are independent and identically-distributed random variables from $\mathrm{Exp}(1)$, the Sukhatme spacings are simply $$ S_1 = n X_{(1)}, \quad S_i = (n - i + 1) \left[X_{(i)} - X_{(i - 1)}\right] $$ for $i = 2, \ldots, n$. These $S_1, \ldots, S_n$ are again independent and identically-distributed random variables from $\mathrm{Exp}(1)$, and the sum telescopes: $$ \sum^n_{i = 1} S_i = \sum^n_{i = 1} X_{(i)} = \sum^n_{i = 1} X_i. $$
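As a quick sanity check of the telescoping identity (not part of the proof), here is a minimal NumPy sketch that builds the spacings from a sorted sample and confirms they sum to the original sample total:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
x = rng.exponential(1.0, size=n)
xs = np.sort(x)

# Sukhatme spacings: S_1 = n * X_(1), S_i = (n - i + 1) * (X_(i) - X_(i-1)).
# With 0-based index j, the coefficient n - j equals n - i + 1 for i = j + 1.
s = (n - np.arange(n)) * np.diff(xs, prepend=0.0)

# The spacings sum back to the original sample total.
assert np.isclose(s.sum(), x.sum())
```

Checking that the $S_i$ are again i.i.d. $\mathrm{Exp}(1)$ would require a distributional test over many samples; the identity above is the part used directly in the derivation.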

Here is an attempt to answer your question. In the case of $k = 1$, define $$ Z_1 = \frac{S_1}{\sum^n_{i = 1} S_i} = \frac{S_1}{S_1 + \sum^n_{i = 2} S_i}. $$ Since $S_1 \sim \mathrm{Gamma}(1, 1)$ and $\sum^n_{i = 2} S_i \sim \mathrm{Gamma}(n - 1, 1)$ are independent, $Z_1 \sim \mathrm{Beta}(1, n - 1)$, so $E(Z_1) = n^{-1}$. Hence, using $S_1 = n X_{(1)}$, $$ \frac{1}{n} = E(Z_1) = E\left(\frac{S_1}{\sum^n_{i = 1} S_i}\right) = n\, E\left(\frac{X_{(1)}}{\sum^n_{i = 1} X_i}\right) \Rightarrow E\left(\frac{X_{(1)}}{\sum^n_{i = 1} X_i}\right) = \frac{1}{n^2}. $$

In the case of $k > 1$, define $$ Z_k = \frac{S_k}{\sum^n_{i = 1} S_i} = \frac{S_k}{S_k + \sum^n_{i = 1, i \neq k} S_i}. $$ By the same Beta argument, $E(Z_k) = n^{-1}$. Accordingly, $$ n^{-1} = E(Z_k) = E\left(\frac{(n - k + 1) \left[X_{(k)} - X_{(k - 1)}\right]}{\sum^n_{i = 1} X_i}\right) \Rightarrow E\left(\frac{X_{(k)}}{\sum^n_{i = 1} X_i}\right) - E\left(\frac{X_{(k - 1)}}{\sum^n_{i = 1} X_i}\right) = \frac{1}{n (n - k + 1)}. $$

Consequently, one can write the telescoping system $$ \begin{array}{l} E\left(\frac{X_{(k)}}{\sum^n_{i = 1} X_i}\right) - E\left(\frac{X_{(k - 1)}}{\sum^n_{i = 1} X_i}\right) = \frac{1}{n (n - k + 1)} \\ E\left(\frac{X_{(k - 1)}}{\sum^n_{i = 1} X_i}\right) - E\left(\frac{X_{(k - 2)}}{\sum^n_{i = 1} X_i}\right) = \frac{1}{n (n - (k - 1) + 1)} \\ \vdots \\ E\left(\frac{X_{(2)}}{\sum^n_{i = 1} X_i}\right) - E\left(\frac{X_{(1)}}{\sum^n_{i = 1} X_i}\right) = \frac{1}{n (n - 1)} \\ E\left(\frac{X_{(1)}}{\sum^n_{i = 1} X_i}\right) = \frac{1}{n^2} \\ \end{array} $$

Summing these equations, one gets $$ E\left(\frac{X_{(k)}}{\sum^n_{i = 1} X_i}\right) = \sum^{k}_{l = 1} \frac{1}{n (n - l + 1)} = \frac{H_n - H_{n - k}}{n}, $$ where $H_m = \sum^m_{j = 1} j^{-1}$ denotes the $m$-th harmonic number. As a consistency check, summing over $k = 1, \ldots, n$ gives $1$, as it must since $\sum_k X_{(k)} = \sum_i X_i$.
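The closed form is easy to check by simulation. Below is a small Monte Carlo sketch (the values of $n$, $k$, and the trial count are arbitrary choices for illustration) comparing the empirical mean of $X_{(k)}/\sum_i X_i$ against $\sum_{l=1}^{k} 1/(n(n-l+1))$:

```python
import numpy as np

rng = np.random.default_rng(42)
n, k, trials = 5, 3, 200_000

# Monte Carlo estimate of E[X_(k) / sum(X_i)] for Exp(1) samples.
x = rng.exponential(1.0, size=(trials, n))
ratio = np.sort(x, axis=1)[:, k - 1] / x.sum(axis=1)
mc = ratio.mean()

# Closed form from the telescoping argument: sum_{l=1}^k 1/(n(n-l+1)).
exact = sum(1.0 / (n * (n - l + 1)) for l in range(1, k + 1))

print(mc, exact)  # the two should agree to roughly three decimal places
```

For $n = 5$, $k = 3$ the exact value is $\tfrac{1}{5}\left(\tfrac{1}{5} + \tfrac{1}{4} + \tfrac{1}{3}\right) = \tfrac{47}{300} \approx 0.1567$.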

Reference: P. V. Sukhatme (1937), "Tests of significance for samples of the $\chi^2$ population with two degrees of freedom," Annals of Eugenics 8, 52–56.