I am struggling with this simple problem.
I have two independent Gaussian random variables $X \sim \mathcal{N}(0,\sigma_x^2)$, $Y \sim \mathcal{N}(0,\sigma_y^2)$. I have to find the density of $X$ given that $X + Y > 0$.
I know that $X$ and $X+Y$ are jointly normally distributed, and I also know the form of the conditional distribution of $X \mid X+Y=z$:
https://en.wikipedia.org/wiki/Multivariate_normal_distribution
https://stats.stackexchange.com/questions/17463/signal-extraction-problem-conditional-expectation-of-one-item-in-sum-of-indepen
but I am confused because my condition is of the form $X+Y > 0$, not $X+Y = z$.
I feel that some integration will have to be performed, but I am not sure how. Any pointers would be very helpful. The conditional mean and variance would also help, if not the entire density function.
Thanks
$Z=X+Y$ is normally distributed with mean $0$ and variance $\sigma_x^2+\sigma_y^2$.
Conditioned on $Z=z$, $X$ is normally distributed with mean $z\frac{\sigma_x^2}{\sigma_x^2+\sigma_y^2}$ and variance $\frac{\sigma_x^2 \sigma_y^2}{\sigma_x^2 + \sigma_y^2}$.
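(For completeness, these conditional moments follow from the standard bivariate-normal conditioning formulas, using $\operatorname{Cov}(X,Z)=\operatorname{Cov}(X,X+Y)=\operatorname{Var}(X)=\sigma_x^2$:
$$\mathbb{E}[X \mid Z=z] = \frac{\operatorname{Cov}(X,Z)}{\operatorname{Var}(Z)}\,z = \frac{\sigma_x^2}{\sigma_x^2+\sigma_y^2}\,z, \qquad \operatorname{Var}(X \mid Z=z) = \sigma_x^2 - \frac{\left(\sigma_x^2\right)^2}{\sigma_x^2+\sigma_y^2} = \frac{\sigma_x^2\,\sigma_y^2}{\sigma_x^2+\sigma_y^2}.)$$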
Conditioned on $Z>0$, $Z$ has a half-normal distribution with mean $\sqrt{(\sigma_x^2+\sigma_y^2)\frac2\pi}$ and variance $(\sigma_x^2+\sigma_y^2)\left(1-\frac2\pi\right)$. Applying the laws of total expectation and total variance to the conditional distribution above, $X$ then has mean $\frac{\sigma_x^2}{\sqrt{\sigma_x^2+\sigma_y^2}}\sqrt{\frac2\pi}$ and variance $\sigma_x^2\,\frac{\sigma_x^2\left(1-\frac2\pi\right)+\sigma_y^2}{\sigma_x^2 + \sigma_y^2}$.
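A quick Monte Carlo sanity check of these two moments (a sketch in Python, using the example variances $\sigma_x^2=400$, $\sigma_y^2=100$):

```python
import numpy as np

rng = np.random.default_rng(0)
sx2, sy2 = 400.0, 100.0          # example variances sigma_x^2, sigma_y^2

# draw (X, Y) pairs and keep only the samples satisfying X + Y > 0
x = rng.normal(0.0, np.sqrt(sx2), 1_000_000)
y = rng.normal(0.0, np.sqrt(sy2), 1_000_000)
xc = x[x + y > 0]

# closed-form conditional mean and variance stated above
mean_formula = sx2 / np.sqrt(sx2 + sy2) * np.sqrt(2 / np.pi)
var_formula = sx2 * (sx2 * (1 - 2 / np.pi) + sy2) / (sx2 + sy2)

print(xc.mean(), mean_formula)   # sample vs. formula
print(xc.var(), var_formula)
```

With a million draws, the sample mean and variance of the retained $X$ values agree with the formulas to within simulation noise.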
But the conditional distribution of $X$ given $X+Y \gt 0$ is not normal. Since $P(X+Y>0)=\frac12$ and $P(X+Y>0 \mid X=x)=P(Y>-x)=\Phi(x/\sigma_y)$, Bayes' rule gives the density $f_{X \mid X+Y>0}(x) = 2\,f_X(x)\,\Phi\!\left(\frac{x}{\sigma_y}\right)$, where $\Phi$ is the standard normal CDF. This is a skew-normal density: right-skewed, though it can take negative values. For example, with $\sigma_x^2=400$ and $\sigma_y^2=100$ it might look something like this, with the mean highlighted:
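The full conditional density can be evaluated directly: Bayes' rule with $P(X+Y>0)=\frac12$ and $P(Y>-x)=\Phi(x/\sigma_y)$ gives $f(x \mid X+Y>0) = 2 f_X(x)\,\Phi(x/\sigma_y)$. A minimal numerical sketch (grid bounds chosen for the $\sigma_x^2=400$, $\sigma_y^2=100$ example) checking that this density integrates to one and reproduces the conditional mean above:

```python
import numpy as np
from math import erf, sqrt

# Example standard deviations: sigma_x^2 = 400, sigma_y^2 = 100.
sx, sy = 20.0, 10.0

def pdf(x, s):
    """Normal density with mean 0 and standard deviation s."""
    return np.exp(-x**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

def cdf(x, s):
    """Normal CDF with mean 0 and standard deviation s."""
    return 0.5 * (1 + np.array([erf(t / (s * sqrt(2))) for t in x]))

xs = np.linspace(-100.0, 120.0, 4401)   # grid covering the bulk of the support
dens = 2 * pdf(xs, sx) * cdf(xs, sy)    # f(x | X+Y > 0) = 2 f_X(x) Phi(x / sigma_y)

dx = xs[1] - xs[0]
total = dens.sum() * dx                  # should be close to 1
mean = (xs * dens).sum() * dx            # should match sigma_x^2/sqrt(sigma_x^2+sigma_y^2) * sqrt(2/pi)
print(total, mean)
```

Plotting `dens` against `xs` reproduces the right-skewed curve described above.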