Probability of conditional tail of a sum of Gaussians


Let $Z\sim\mathcal{N}(\mathbf{0}, I_n)$ be a standard Gaussian vector. Define $X=Z\cdot \mathbf{u}$ and $Y=Z\cdot \mathbf{v}$, where $\mathbf{u}, \mathbf{v}$ are unit vectors with $\mathbf{u}\cdot\mathbf{v}=0$. Then we have that $$ X,Y\sim \mathcal{N}(0, 1),\quad X+Y,X-Y\sim \mathcal{N}(0, 2). $$ However, it is not clear to me whether $X$ and $Y$ are independent.
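The distributional claims above are easy to sanity-check by Monte Carlo. The following is a minimal sketch using only the Python standard library; the particular orthonormal pair $\mathbf{u},\mathbf{v}$ in $\mathbb{R}^3$ is an arbitrary choice of mine:

```python
import math
import random

random.seed(0)

# One arbitrary orthonormal pair in R^3 (any unit vectors with u.v = 0 work).
u = (1 / math.sqrt(2), 1 / math.sqrt(2), 0.0)
v = (1 / math.sqrt(2), -1 / math.sqrt(2), 0.0)

N = 200_000
xs, ys = [], []
for _ in range(N):
    z = [random.gauss(0, 1) for _ in range(3)]
    xs.append(sum(ui * zi for ui, zi in zip(u, z)))  # X = Z . u
    ys.append(sum(vi * zi for vi, zi in zip(v, z)))  # Y = Z . v

def var(s):
    m = sum(s) / len(s)
    return sum((t - m) ** 2 for t in s) / len(s)

print(var(xs))                                  # should be close to 1
print(var(ys))                                  # should be close to 1
print(var([x + y for x, y in zip(xs, ys)]))     # should be close to 2
print(var([x - y for x, y in zip(xs, ys)]))     # should be close to 2
```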

I am now trying to find the probability of the event $|X|-|Y| > C$ for a given $C\in\mathbb{R}$. My approach so far is the following. $$ \newcommand{\Prob}{\mathbb{P}} \begin{align} \Prob(|X| - |Y| > C) &= \frac14\big(\Prob(X-Y>C\mid X,Y>0) + \Prob(X+Y>C\mid X>0,Y<0) +\\ &\qquad\;\Prob(-X-Y>C\mid X<0,Y>0) + \Prob(-X+Y>C\mid X,Y<0)\big). \end{align} $$ However, I am not sure how to calculate each of these four probabilities. I would expect an answer in terms of the standard normal CDF, i.e. an expression involving $\Phi(C)$, but I don't see how to get there. Any hints or pointers would help. Thanks!

Accepted answer:

We have $$\begin{align}E[XY]&= E\left[\sum_{i,j}u_iZ_iv_jZ_j\right]\\ &=E\left[\sum_{i}u_iv_iZ_i^2 + \sum_{i\ne j}u_iv_jZ_iZ_j\right]\\ &=\sum_{i}u_iv_iE[Z_i^2]\\ &=\sum_iu_iv_i\\&=\mathbf{u}\cdot\mathbf{v}\\ &=0,\end{align}$$ where the cross terms vanish because $E[Z_iZ_j]=E[Z_i]E[Z_j]=0$ for $i\ne j$. So $X$ and $Y$ are uncorrelated (since also $E[X]=E[Y]=0$); hence, because their joint distribution is bivariate normal (any linear transformation of a Gaussian vector is Gaussian), $X$ and $Y$ are in fact independent.
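Both the uncorrelatedness and the resulting independence can be spot-checked empirically. A sketch (standard library only; the orthonormal pair below is an arbitrary choice of mine) estimates $E[XY]$ and the quadrant probability $P(X>0,\,Y>0)$, which should be $\tfrac14 = \tfrac12\cdot\tfrac12$ under independence:

```python
import random

random.seed(1)

u = (0.6, 0.8, 0.0)    # unit vector
v = (-0.8, 0.6, 0.0)   # unit vector orthogonal to u

N = 200_000
xy_sum = 0.0
both_pos = 0
for _ in range(N):
    z = [random.gauss(0, 1) for _ in range(3)]
    x = sum(ui * zi for ui, zi in zip(u, z))
    y = sum(vi * zi for vi, zi in zip(v, z))
    xy_sum += x * y
    both_pos += (x > 0) and (y > 0)

print(xy_sum / N)      # sample E[XY], should be close to 0
print(both_pos / N)    # P(X>0, Y>0), should be close to 1/4
```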

Therefore, using the symmetry of the distributions of $X$ and $Y$, we obtain $$\begin{align} P(|X|-|Y|>c) &=P(X-Y>c\quad\cap\quad X>0\quad\cap\quad Y>0)\\[1ex] &+P(X+Y>c\quad\cap\quad X>0\quad\cap\quad Y<0)\\[1ex] &+P(-X-Y>c\quad\cap\quad X<0\quad\cap\quad Y>0)\\[1ex] &+P(-X+Y>c\quad\cap\quad X<0\quad\cap\quad Y<0)\\[2ex] &=P(X-Y>c\quad\cap\quad X>0\quad\cap\quad Y>0)\\[1ex] &+P(X-Y'>c\quad\cap\quad X>0\quad\cap\quad Y'>0)\quad\text{letting $Y'=-Y$}\\[1ex] &+P(X'-Y>c\quad\cap\quad X'>0\quad\cap\quad Y>0)\quad\text{letting $X'=-X$}\\[1ex] &+P(X'-Y'>c\quad\cap\quad X'>0\quad\cap\quad Y'>0)\\[2ex] &=4\,P(X-Y>c\quad\cap\quad X>0\quad\cap\quad Y>0)\\[2ex] &=4\,\int_{-\infty}^\infty P(x-Y>c\quad\cap\quad x>0\quad\cap\quad Y>0\mid X=x)\phi(x)\,dx\\[2ex] &=4\,\int_0^\infty P(0<Y<x-c)\,\phi(x)\,dx\\[2ex] &=\begin{cases}4\int_0^\infty\left[\Phi(x-c) - 1/2 \right]\,\phi(x)\,dx&\text{if $c<0$}\\[1ex] 4\int_c^\infty\left[\Phi(x-c) - 1/2 \right]\,\phi(x)\,dx&\text{if $c\ge 0$}\end{cases}\\[2ex] &=\begin{cases}4\int_0^\infty\Phi(x-c)\,\phi(x)\,dx-1&\text{if $c<0$}\\[1ex] 4\int_c^\infty\Phi(x-c)\,\phi(x)\,dx-2\left[1-\Phi(c)\right]&\text{if $c\ge 0$}\end{cases}\\[2ex] &=\begin{cases}4\,\Phi(h)-2\Phi(h)^2 - 1&\text{if $c<0$}\\[1ex] 4\Phi(h)-4\,\text{BvN}[c,h;\rho=-{1\over\sqrt{2}}]-2\,\left[1-\Phi(c)\right]&\text{if $c\ge 0$}\end{cases}\\[2ex] \end{align}$$
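Before introducing the closed form, the piecewise integral representation above can be checked directly. The sketch below (standard library only; function names are mine) evaluates $4\int_{\max(0,c)}^\infty\left[\Phi(x-c)-\tfrac12\right]\phi(x)\,dx$ by a midpoint rule and compares it against a Monte Carlo estimate of $P(|X|-|Y|>c)$ for independent standard normals:

```python
import math
import random

def Phi(t):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def phi(t):
    """Standard normal PDF."""
    return math.exp(-0.5 * t * t) / math.sqrt(2.0 * math.pi)

def tail_quadrature(c, hi=10.0, steps=20_000):
    """4 * integral_{max(0,c)}^{hi} [Phi(x-c) - 1/2] phi(x) dx (midpoint rule;
    hi = 10 truncates the negligible upper tail)."""
    lo = max(0.0, c)
    h = (hi - lo) / steps
    total = 0.0
    for k in range(steps):
        x = lo + (k + 0.5) * h
        total += (Phi(x - c) - 0.5) * phi(x)
    return 4.0 * h * total

random.seed(2)
N = 400_000
for c in (-0.5, 0.0, 1.0):
    hits = sum(abs(random.gauss(0, 1)) - abs(random.gauss(0, 1)) > c
               for _ in range(N))
    print(c, tail_quadrature(c), hits / N)  # the two estimates should agree
```

At $c=0$ both estimates should be near $\tfrac12$, as expected by symmetry since $P(|X|>|Y|)=\tfrac12$.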

where $h=-{c\over\sqrt{2}},$ and $\phi,\Phi,\text{BvN}$ are, respectively, the standard normal PDF, CDF, and the following bivariate normal CDF:

$$\text{BvN}[x',y';\rho]={1\over 2\pi\sqrt{1-\rho^2}}\int_{-\infty}^{y'}\int_{-\infty}^{x'}\exp\left[-\left({x^2-2\rho x y+y^2\over 2(1-\rho^2)} \right) \right]\,dx\,dy.$$

In the final step of the above development we have used the following formulas$^\dagger$: $$\int_0^\infty\Phi(x-c)\,\phi(x)\,dx=\Phi(h)-{1\over 2}\Phi(h)^2 $$ and $$\int_c^\infty\Phi(x-c)\,\phi(x)\,dx=\Phi(h)-\text{BvN}\left[c,h;\rho=-{1\over\sqrt{2}}\right]. $$ $^\dagger$ Found from formulas 10.010.4, 10.010.6 (p.403), 2.3 (p.414), and 10.010.1 (p.402) in the following reference: Owen, D. (1980). "A table of normal integrals". Communications in Statistics: Simulation and Computation. B9 (4): 389–419.
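The final closed form can be verified numerically. The following sketch (standard library only; function names are my own) computes $\text{BvN}$ by reducing the double integral to the single integral $\int_{-\infty}^{x'}\phi(x)\,\Phi\!\big((y'-\rho x)/\sqrt{1-\rho^2}\big)\,dx$, then evaluates the piecewise formula and compares it with Monte Carlo:

```python
import math
import random

SQRT2 = math.sqrt(2.0)

def Phi(t):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(t / SQRT2))

def phi(t):
    """Standard normal PDF."""
    return math.exp(-0.5 * t * t) / math.sqrt(2.0 * math.pi)

def bvn(xp, yp, rho, lo=-10.0, steps=20_000):
    """BvN[x', y'; rho] via the 1-D reduction
    integral_{-inf}^{x'} phi(x) * Phi((y' - rho*x)/sqrt(1-rho^2)) dx,
    midpoint rule with lo = -10 standing in for -infinity."""
    s = math.sqrt(1.0 - rho * rho)
    h = (xp - lo) / steps
    total = 0.0
    for k in range(steps):
        x = lo + (k + 0.5) * h
        total += phi(x) * Phi((yp - rho * x) / s)
    return h * total

def prob_abs_diff_tail(c):
    """P(|X| - |Y| > c) for independent standard normal X, Y,
    using the piecewise closed form derived above (h = -c/sqrt(2))."""
    hh = -c / SQRT2
    if c < 0:
        return 4.0 * Phi(hh) - 2.0 * Phi(hh) ** 2 - 1.0
    return 4.0 * Phi(hh) - 4.0 * bvn(c, hh, -1.0 / SQRT2) - 2.0 * (1.0 - Phi(c))

# Compare against Monte Carlo at a few values of c.
random.seed(3)
N = 400_000
for c in (-1.0, 0.0, 1.0):
    hits = sum(abs(random.gauss(0, 1)) - abs(random.gauss(0, 1)) > c
               for _ in range(N))
    print(c, prob_abs_diff_tail(c), hits / N)  # the two columns should agree
```

A useful spot check: at $c=0$, $\text{BvN}[0,0;\rho]=\tfrac14+\tfrac{\arcsin\rho}{2\pi}$ gives $\tfrac18$ for $\rho=-\tfrac1{\sqrt2}$, so the $c\ge 0$ branch yields $2-\tfrac12-1=\tfrac12$, matching the $c<0$ branch in the limit, as it should by continuity.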