Entropy of Order Statistic


Consider $n$ independent and identically distributed random variables $\{X_i\}_{i=1,\dots,n}$ with support on some interval $[a,b]$, and their $n$th order statistic $\max_{i \in \{1,\dots,n\}} X_i$. The following "entropy-looking" measure of dispersion of the maximum is

$$ - \int_a^b F^n(x) \ln F^n(x) dx ,$$ where $F(x)= \Pr (X \le x) $. It seems natural that the "entropy" should be decreasing in $n$ (just think about $n$ very large). Is this a known result?
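As a sanity check on the intuition, the measure can be evaluated numerically. This is a sketch (the function name and the midpoint-rule integrator are my own choices, not from the post); for $X$ uniform on $[0,1]$ we have $F(x)=x$, and the measure works out in closed form to $n/(n+1)^2$, which is indeed decreasing in $n$:

```python
import math

def cumulative_entropy_of_max(F, n, a=0.0, b=1.0, steps=100_000):
    """Numerically evaluate -∫_a^b F(x)^n ln F(x)^n dx via the midpoint rule."""
    h = (b - a) / steps
    total = 0.0
    for i in range(steps):
        x = a + (i + 0.5) * h
        p = F(x) ** n          # CDF of the maximum of n i.i.d. draws
        if p > 0.0:            # integrand is 0 in the limit p -> 0
            total -= p * math.log(p) * h
    return total

# X uniform on [0,1], so F(x) = x; closed form for the measure is n/(n+1)^2.
values = [cumulative_entropy_of_max(lambda x: x, n) for n in range(1, 6)]
print([round(v, 4) for v in values])  # 0.25, 0.2222, 0.1875, 0.16, 0.1389
```

So for the uniform case the measure does decrease with $n$, which is what makes the conjecture tempting.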

I did in fact prove that the entropy is monotone, but the proof turned out to be lengthy and messy. I would expect that there is a simple argument. Does anyone know?


Best answer:

No, the measure is not monotone in general. For example, take $F(x) = x^{1/N}$ on $[0,1]$, so that $F^n(x) = x^{n/N}$ and $\max(X_1,\ldots,X_N)$ is uniform on $[0,1]$. Using $\int_0^1 x^{\alpha}\ln x \, dx = -1/(\alpha+1)^2$, the measure evaluates to $$ - \int_0^1 x^{n/N} \ln x^{n/N} \, dx = \frac{n/N}{(1+n/N)^2} = \frac{nN}{(n+N)^2} ,$$ which increases in $n$ for $1 \le n \le N$, peaks at the value $1/4$ when $n = N$, and decreases after that.
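The counterexample can be confirmed numerically against the closed form $nN/(n+N)^2$. This is a sketch (the helper name and the midpoint-rule integrator are my own, not from the answer), with $N=4$ chosen arbitrarily:

```python
import math

def cumulative_entropy(G, a=0.0, b=1.0, steps=100_000):
    """Numerically evaluate -∫_a^b G(x) ln G(x) dx for a CDF G (midpoint rule)."""
    h = (b - a) / steps
    total = 0.0
    for i in range(steps):
        x = a + (i + 0.5) * h
        p = G(x)
        if p > 0.0:            # integrand vanishes as p -> 0
            total -= p * math.log(p) * h
    return total

# Counterexample: F(x) = x^(1/N) on [0,1], so F^n(x) = x^(n/N).
N = 4
numeric = [cumulative_entropy(lambda x, n=n: x ** (n / N)) for n in range(1, 9)]
closed = [n * N / (n + N) ** 2 for n in range(1, 9)]  # nN/(n+N)^2
print([round(v, 4) for v in numeric])
# rises for n < N, peaks at 1/4 when n = N, then falls
```

The sequence is not monotone: it increases up to $n=N$ and decreases afterwards, matching the closed form.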