Proving a theorem involving the minimum element of a multivariate Gaussian distribution


I want to prove a relatively simple theorem, but I don't know where to start.

Consider a multivariate Gaussian:

$X \sim \mathcal{N}(\boldsymbol{\mu}, \sigma \boldsymbol{I})$

where $\boldsymbol{I}$ is the $d \times d$ identity matrix, $\sigma > 0$ is a constant, and the mean $\boldsymbol{\mu}$ is a $d \times 1$ vector with strictly increasing entries (so that $\mu_n < \mu_m$ iff $n < m$).

Each sample from this distribution is a $d$-dimensional vector $X^{(i)}$. The index of the minimum element of that sample is a random variable $A^{(i)} \in \{1,2,\dots,d\}$.
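To make the setup concrete, here is a minimal NumPy sketch of the sampling experiment; the particular $\boldsymbol{\mu}$ and $\sigma$ values are illustrative, not taken from the question:

```python
import numpy as np

rng = np.random.default_rng(0)

mu = np.array([0.0, 0.5, 1.0, 1.5])  # illustrative ordered means
sigma = 1.0                          # covariance is sigma * I, as above
d = len(mu)
n = 100_000

# Each row is one sample X^{(i)} ~ N(mu, sigma * I)
X = rng.normal(loc=mu, scale=np.sqrt(sigma), size=(n, d))

# A^{(i)}: index of the minimum element (1-based, matching the question)
A = X.argmin(axis=1) + 1

# Empirical estimate of P(A = a) for each a in {1, ..., d}
p_hat = np.array([(A == a).mean() for a in range(1, d + 1)])
print(p_hat)
```

Because the means are increasing, the argmin probability mass concentrates on the smaller indices.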

Consider experiments involving different values of $\boldsymbol{\mu}$. In particular, consider $\boldsymbol{\mu_1}$ and $\boldsymbol{\mu_2}$, which were carefully selected to meet two constraints:

Constraint 1: $P(A = 1 | \boldsymbol{\mu_1}, \sigma) > P(A = 1 | \boldsymbol{\mu_2}, \sigma)$

Constraint 2: $\exists a \in \{2,...,d\} \text{ s.t., } P(A = a | \boldsymbol{\mu_1}, \sigma) = P(A = a | \boldsymbol{\mu_2}, \sigma)$

Given these constraints, I want to prove the following statement:

$$\mathbb{E}_{X \sim \mathcal{N}(\boldsymbol{\mu_1}, \sigma \boldsymbol{I})}[X_a \mid A = a] < \mathbb{E}_{Y \sim \mathcal{N}(\boldsymbol{\mu_2}, \sigma \boldsymbol{I})}[Y_a \mid A = a]$$

where $X_a$ and $Y_a$ denote the $a$-th elements of $X$ and $Y$ respectively, and in each expectation $A$ is the argmin index of the corresponding vector.
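One way to build intuition before attempting a proof is a Monte Carlo check of the two conditional expectations. A sketch, assuming $\sigma$ scales the covariance as written above; the $\boldsymbol{\mu_1}$ and $\boldsymbol{\mu_2}$ below are illustrative only and are not tuned to satisfy the two constraints exactly:

```python
import numpy as np

def cond_mean_given_argmin(mu, sigma, a, n=200_000, seed=0):
    """Monte Carlo estimate of E[X_a | A = a] for X ~ N(mu, sigma * I).

    `a` is 1-based, matching the question's indexing.
    """
    rng = np.random.default_rng(seed)
    X = rng.normal(loc=mu, scale=np.sqrt(sigma), size=(n, len(mu)))
    mask = X.argmin(axis=1) == (a - 1)  # keep samples where component a is the min
    return X[mask, a - 1].mean()

# Illustrative means only; NOT carefully selected to meet the constraints.
mu1 = np.array([0.0, 1.0, 2.0])
mu2 = np.array([0.5, 1.0, 2.0])
a = 2

print(cond_mean_given_argmin(mu1, 1.0, a))
print(cond_mean_given_argmin(mu2, 1.0, a))
```

Note that conditioning on $X_a$ being the minimum pulls the estimate below $\mu_a$; quantifying how that downward shift depends on the other components of $\boldsymbol{\mu}$ is essentially what a proof would have to do.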

I don't know where to start, and I would appreciate some direction.