$P(x>0\mid x+y>0)\;$ for two independent Gaussian random variables?


I am given two independent Gaussian random variables, $X$ and $Y$, with unknown means and variances. We draw a value $x$ from $X$ and a value $y$ from $Y$. Given that $x+y>0$, what is the probability that $x>0$, in terms of the unknown parameters?

Suppose $X\sim N(\mu_1, \sigma_1^2)$ and $Y\sim N(\mu_2, \sigma_2^2)$.

I know that $X+Y\sim N(\mu_1+\mu_2, \sigma_1^2+\sigma_2^2)$. I considered using Bayes’ theorem.

$$P(x>0|x+y>0)=\frac{P(x>0)P(x+y>0|x>0)}{P(x+y>0)}.$$

Since I know the distributions of $X$ and $X+Y$, I can write $P(x>0)$ and $P(x+y>0)$ as Gaussian integrals. However, I am unsure of the form of the conditional probability $P(x+y>0\mid x>0)$. I feel this would be easier to see by looking at the joint distribution of $(X, X+Y)$ in the plane, but I am not exactly sure how to write it in terms of the unknowns. Could someone please shed some insight on this? Thanks!
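One way to sidestep the conditional term (a sketch, not necessarily the intended route): the numerator of Bayes' theorem is just the joint probability $P(x>0)\,P(x+y>0\mid x>0)=P(x>0,\,x+y>0)$, which by independence of $X$ and $Y$ can be written as a double integral over the region $\{x>0,\; y>-x\}$:

$$P(X>0,\,X+Y>0)=\int_0^\infty\int_{-x}^\infty \frac{1}{2\pi\sigma_1\sigma_2}\exp\!\left(-\frac{(x-\mu_1)^2}{2\sigma_1^2}-\frac{(y-\mu_2)^2}{2\sigma_2^2}\right)dy\,dx.$$

Carrying out the inner integral with the standard normal pdf $\varphi$ and cdf $\Phi$ gives

$$P(x>0\mid x+y>0)=\frac{\displaystyle\int_0^\infty \frac{1}{\sigma_1}\,\varphi\!\left(\frac{x-\mu_1}{\sigma_1}\right)\left[1-\Phi\!\left(\frac{-x-\mu_2}{\sigma_2}\right)\right]dx}{1-\Phi\!\left(\dfrac{-(\mu_1+\mu_2)}{\sqrt{\sigma_1^2+\sigma_2^2}}\right)},$$

which leaves the answer as a combination of integrals, consistent with the 2D-plane picture.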

Edit: I believe this was meant to be a conceptual question, and the answer should be left as a combination of integrals.
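For anyone who wants to sanity-check the integral form numerically, here is a short Python sketch. The parameter values (`mu1`, `s1`, `mu2`, `s2`) are made up for illustration, since the question leaves them unknown; it evaluates the one-dimensional integral for $P(X>0,\,X+Y>0)$ by simple quadrature and compares the resulting conditional probability with a Monte Carlo estimate.

```python
import math
import numpy as np

# Hypothetical parameters for illustration (the question leaves them unknown).
mu1, s1 = 0.5, 1.0
mu2, s2 = -0.3, 2.0

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Numerator: P(X > 0, X + Y > 0) = integral over x > 0 of f_X(x) * P(Y > -x) dx,
# approximated by a Riemann sum on a grid wide enough to capture the tail of X.
xs = np.linspace(0.0, mu1 + 10.0 * s1, 20000)
dx = xs[1] - xs[0]
fX = np.exp(-(xs - mu1) ** 2 / (2 * s1 ** 2)) / (s1 * math.sqrt(2 * math.pi))
tail_Y = np.array([1.0 - Phi((-x - mu2) / s2) for x in xs])
numerator = np.sum(fX * tail_Y) * dx

# Denominator: P(X + Y > 0), using X + Y ~ N(mu1 + mu2, s1^2 + s2^2).
s_sum = math.sqrt(s1 ** 2 + s2 ** 2)
denominator = 1.0 - Phi(-(mu1 + mu2) / s_sum)

analytic = numerator / denominator

# Monte Carlo sanity check of P(x > 0 | x + y > 0).
rng = np.random.default_rng(0)
x = rng.normal(mu1, s1, 1_000_000)
y = rng.normal(mu2, s2, 1_000_000)
mc = np.mean(x[x + y > 0] > 0)

print(analytic, mc)
```

The two printed numbers should agree to a few decimal places, which supports the view that the cleanest closed form here really is the ratio of integrals.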