Mixed moments of minima of i.i.d. exponentials.


Let $T_1,T_2,T_3,\ldots$ be i.i.d. $\exp(1)$ random variables (i.e., exponentials with parameter $1$).


By doing explicit computations, it can be shown that $$E[\min\{T_1,T_2\}\min\{T_2,T_3\}]=\frac13>\frac14=E[\min\{T_1,T_2\}\min\{T_3,T_4\}],$$ and that \begin{align} &E[\min\{T_1,T_2,T_3\}\min\{T_2,T_3,T_4\}]=1/6\\ >&E[\min\{T_1,T_2,T_3\}\min\{T_3,T_4,T_5\}]=2/15\\ >&E[\min\{T_1,T_2,T_3\}\min\{T_4,T_5,T_6\}]=1/9. \end{align}
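As a quick numerical sanity check (not part of the original computation), these values can be reproduced by Monte Carlo simulation; the NumPy script below is an illustrative sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 2_000_000                            # large enough that the gaps dwarf Monte Carlo noise
T = rng.exponential(1.0, size=(N, 6))    # columns play the role of T_1, ..., T_6

# Pairwise minima: one shared variable vs. disjoint index sets
shared2 = np.minimum(T[:, 0], T[:, 1]) * np.minimum(T[:, 1], T[:, 2])
disjoint2 = np.minimum(T[:, 0], T[:, 1]) * np.minimum(T[:, 2], T[:, 3])
print(shared2.mean(), disjoint2.mean())          # approx 1/3 and 1/4

# Triple minima: overlap of size 2, 1, and 0
m123 = T[:, 0:3].min(axis=1)
print((m123 * T[:, 1:4].min(axis=1)).mean(),     # approx 1/6
      (m123 * T[:, 2:5].min(axis=1)).mean(),     # approx 2/15
      (m123 * T[:, 3:6].min(axis=1)).mean())     # approx 1/9
```

The estimates decrease as the overlap shrinks, matching the exact values above.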


Similar computations suggest that the same inequalities hold for higher moments, i.e., $$E[\min\{T_1,T_2\}^m\min\{T_2,T_3\}^n]>E[\min\{T_1,T_2\}^m\min\{T_3,T_4\}^n],\qquad m,n\in\mathbb N.$$


Based on these computations, the general principle seems to be that when looking at mixed moments in these minima of independent exponentials (which are themselves exponentials), the more $T_i$'s they have in common, the higher the moments will be.

However, I can't find a conceptual argument (if there even is one) to prove something like this that would be more illuminating than direct computations.


Edit: here's a formal statement of what I think is true:

Let $p\in\mathbb N$, and consider the sets of indices \begin{align} I&=\{i_1,i_2,\ldots,i_p\}\qquad J=\{j_1,j_2,\ldots,j_p\}\\ K&=\{k_1,k_2,\ldots,k_p\}\qquad L=\{\ell_1,\ell_2,\ldots,\ell_p\}. \end{align} If $|I\cap J|>|K\cap L|$, then $$E[\min\{T_i:i\in I\}^m\min\{T_j:j\in J\}^n]>E[\min\{T_k:k\in K\}^m\min\{T_\ell:\ell\in L\}^n]$$ for all $m,n\in\mathbb N$.
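For the higher-moment version of the conjecture, the ordering can likewise be probed numerically; the choice of exponents $m=2$, $n=3$ in the sketch below is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 2_000_000
T = rng.exponential(1.0, size=(N, 4))    # columns play the role of T_1, ..., T_4

m, n = 2, 3  # exponents; per the conjecture, any m, n should show the same ordering
overlap = (np.minimum(T[:, 0], T[:, 1])**m * np.minimum(T[:, 1], T[:, 2])**n).mean()
# Independent case: product of moments of two exp(2) minima, (m!/2^m)(n!/2^n) = 3/8 here
no_overlap = (np.minimum(T[:, 0], T[:, 1])**m * np.minimum(T[:, 2], T[:, 3])**n).mean()
print(overlap, no_overlap)   # the overlapping pair gives the larger estimate
```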

Accepted answer:

The conjecture is true, even in a stronger "conceptual" form.

Let $X_1,X_2,\ldots$ be i.i.d. random variables, and let $I,J,K,L\subset \mathbb{N}$ be such that $|I|=|K|=n$, $|J|=|L|=m$, and $|K\cap L|<|I\cap J|$. Let also $f\colon \mathbb R^{n}\to \mathbb R$ and $g\colon \mathbb R^{m}\to \mathbb R$ be symmetric measurable functions, non-decreasing in all variables. Then $$ \mathrm{E} [f(X_i,i\in K)g(X_j,j\in L)]\le \mathrm{E} [f(X_i,i\in I)g(X_j,j\in J)], $$ with equality possible only in the case where $f$ or $g$ is constant on the support of $X_1$.

Sketch. Assume without loss of generality that $\{1,\dots,l\}=K\cap L \subset I\cap J = \{1,\dots,k\}$. By induction, it is enough to prove the claim when $l=k-1$. After relabeling, we may also assume $k\in K\setminus L$ and $k+1\in L\setminus K$.

Fix the first $k-1$ variables and integrate over all variables with indices greater than $k+1$. Then the inequality reduces to the following: for any $a\in \mathbb{R}^{k-1}$, $$ \mathrm{E} [f_1(a,X_k)g_1(a,X_{k+1})] = \mathrm{E} [f_1(a,X_k)]\,\mathrm{E} [g_1(a,X_{k+1})] \le \mathrm{E}[f_1(a,X_k)g_1(a,X_k)],\tag{1} $$ where for $a\in\mathbb{R}^{k-1}$, $x\in\mathbb{R}$, $$ f_1(a,x) = \mathrm{E}[f(a,x,X_1,\dots,X_{n-k})],\qquad g_1(a,x) = \mathrm{E}[g(a,x,X_1,\dots,X_{m-k})]; $$ by symmetry and independence it does not matter which copies of the $X_i$ integrate out the remaining coordinates, and both $f_1(a,\cdot)$ and $g_1(a,\cdot)$ are non-decreasing. The equality in (1) holds by independence of $X_k$ and $X_{k+1}$; the inequality is Chebyshev's correlation inequality: two non-decreasing functions of the same random variable are non-negatively correlated. Substituting $a = (X_1,\dots,X_{k-1})$ and taking the expectation, we get the required inequality.
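The one-variable correlation inequality invoked here is easy to check numerically; in the sketch below, $f$ and $g$ are arbitrary non-decreasing test functions of an $\exp(1)$ variable:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.exponential(1.0, size=2_000_000)

# Two arbitrary non-decreasing test functions
f = np.minimum(X, 1.0)
g = X**2

lhs = (f * g).mean()       # estimate of E[f(X) g(X)]
rhs = f.mean() * g.mean()  # estimate of E[f(X)] E[g(X)]
print(lhs, rhs)            # lhs exceeds rhs: non-decreasing functions of the
                           # same variable are non-negatively correlated
```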

The statement about equality follows from the corresponding equality condition in Chebyshev's correlation inequality.