Maximum of a Gaussian random vector


Suppose that $X\sim N(0,\Sigma)$ and $Y\sim N(0,\Omega)$ are independent random vectors in $\mathbb{R}^d$ ($\Sigma$ and $\Omega$ are positive definite). It is known (Anderson's inequality) that $$ \mathsf{P}(X\in A)\ge \mathsf{P}(X+Y\in A) $$ for any convex set $A\subseteq \mathbb{R}^d$ that is symmetric about $0$. In particular, taking $A=[-t,t]^d$, $$ \mathsf{P}(\max_{i}|X_i|\le t)\ge \mathsf{P}(\max_{i}|X_i+Y_i|\le t) $$ for any $t\ge 0$. Do we have an analogous relation for the one-sided maxima of these vectors, i.e. $$ \mathsf{P}(\max_{i}\{X_i\}\le t)\ge \mathsf{P}(\max_{i}\{X_i+Y_i\}\le t),\quad t\ge 0? $$ Note that the set $\{x:\max_i x_i\le t\}=(-\infty,t]^d$ is convex but not symmetric about $0$, so Anderson's inequality does not apply directly.
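For intuition, one can probe the conjectured one-sided inequality by Monte Carlo. The sketch below uses hypothetical covariances $\Sigma=I_d$ and $\Omega=\tfrac12 I_d$ (my choice, not from the question) and compares the two probabilities at a single threshold $t$; it is only a numerical check for one example, not a proof.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3

# Hypothetical covariance matrices for illustration (not from the question).
Sigma = np.eye(d)
Omega = 0.5 * np.eye(d)

n = 200_000  # number of Monte Carlo samples
X = rng.multivariate_normal(np.zeros(d), Sigma, size=n)
Y = rng.multivariate_normal(np.zeros(d), Omega, size=n)

t = 1.0
# Empirical estimates of P(max_i X_i <= t) and P(max_i (X_i + Y_i) <= t).
p_X = np.mean(X.max(axis=1) <= t)
p_XY = np.mean((X + Y).max(axis=1) <= t)
print(p_X, p_XY)
```

In this diagonal case both probabilities factor into products of univariate normal CDFs ($\Phi(t)^d$ versus $\Phi(t/\sqrt{1.5})^d$), so the inequality holds trivially here; the interesting cases are non-diagonal $\Sigma$, $\Omega$, which can be tested by swapping in other positive-definite matrices.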