Inequality of Probability


This problem is from Matrix Analysis for Statistics by James R. Schott.

Problem. Show that if $x$ and $y$ are independent random vectors with $x\sim{\rm Normal}(0,\Omega_1)$ and $y\sim{\rm Normal}(0,\Omega_2)$, such that $\Omega_1-\Omega_2$ is positive semi-definite, then $$P(x\in S)\leq P(y\in S)$$ for every convex subset $S$ of $\mathbb{R}^m$ that is symmetric about $0$.

The book provides the following theorem:

Theorem. Let $S$ be a convex subset of $\mathbb{R}^m$, symmetric about $0$, and let $f:\mathbb{R}^m\to\mathbb{R}$ be a non-negative function with $f(x)=f(-x)$ such that $\{x : f(x)\geq\alpha\}$ is convex for every $\alpha>0$. Suppose $\int_S f(x)\,dx<\infty$. Then $$\int_S f(x+cy)\,dx\geq\int_S f(x+y)\,dx$$ for every $c\in[0,1]$ and $y\in\mathbb{R}^m$.

In particular, if $f$ is the p.d.f. of $x$, then the inequality reads $$P(x+cy\in S\mid y)\geq P(x+y\in S\mid y)$$ for every $c\in[0,1]$ and fixed $y\in\mathbb{R}^m$. Furthermore, if $y$ is a random vector independent of $x$, then by integrating over all possible values of $y$ we obtain $$P(x+cy\in S)\geq P(x+y\in S)$$ for every $c\in[0,1]$.
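To spell out the integration step (assuming, for concreteness, that $y$ has a density $g$; the same argument works by integrating against the distribution of $y$ in general), independence of $x$ and $y$ gives

```latex
P(x+cy\in S)
  = \int_{\mathbb{R}^m} P(x+cu\in S)\,g(u)\,du
  \;\geq\; \int_{\mathbb{R}^m} P(x+u\in S)\,g(u)\,du
  = P(x+y\in S),
```

where the middle inequality holds pointwise in $u$ by the theorem.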

Moreover, the p.d.f. of a multivariate normal distribution clearly satisfies the conditions of the theorem, so it is natural to try to prove the desired result using the inequality just obtained. I tried setting $x:=y$ and $y:=x-y$, but $y$ and $x-y$ are clearly not independent, so this approach fails. I hope someone has a better solution.
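As a numerical sanity check (not a proof), here is a quick Monte Carlo sketch of the claimed inequality. The covariances $\Omega_1,\Omega_2$ below are an assumed example chosen so that $\Omega_1-\Omega_2$ is positive semi-definite, and $S$ is taken to be the unit ball, a convex set symmetric about $0$:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 2, 200_000

# Assumed example covariances; Omega1 - Omega2 = [[1, 0.3], [0.3, 1]],
# which has eigenvalues 0.7 and 1.3, hence is positive semi-definite.
Omega1 = np.array([[2.0, 0.5], [0.5, 2.0]])
Omega2 = np.array([[1.0, 0.2], [0.2, 1.0]])

x = rng.multivariate_normal(np.zeros(m), Omega1, size=n)
y = rng.multivariate_normal(np.zeros(m), Omega2, size=n)

# S = unit ball in R^m: estimate P(x in S) and P(y in S) by sample frequencies.
p_x = np.mean(np.sum(x**2, axis=1) <= 1.0)
p_y = np.mean(np.sum(y**2, axis=1) <= 1.0)
print(p_x, p_y)  # the claim predicts p_x <= p_y
```

With the larger covariance $\Omega_1$, the mass of $x$ is more spread out, so its estimated probability of landing in the unit ball should come out smaller, consistent with $P(x\in S)\leq P(y\in S)$.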