Suppose $X,Y$ are i.i.d. $N(0,1)$ random variables. I want to find the density of $X$ conditioned on the event $X+Y > 0.$
My issue is that this is conditioning on an event instead of a random variable. Let $Z = X+Y$; I know that $Z\sim N(0,2),$ and I know how to work out $p(x|z) = \frac{p(x,z)}{p(z)},$ since I could work out $p(x,z)$ with the transformation of variables formula.
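For concreteness, the map $(X,Y)\mapsto(X,Z)$ with $Z = X+Y$ has Jacobian $1$, so with $\varphi$ the standard normal density,
$$p(x,z) = p_{X,Y}(x,\, z-x) = \varphi(x)\,\varphi(z-x).$$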
But here we have $p(x | Z > 0).$ Is it true that
$$ p(x | Z>0) = \int^{\infty}_0 p(x|z) p(z) dz $$
and if so, why? If not, what is the correct way to compute this conditional density?
Here are two approaches I thought of, but I'm interested in the answer to my previous question and/or other (hopefully easier or more direct) approaches.
To compute $p(x|X+Y>0)$ we can compute $\mathbb{P}(X\leq t | X+Y > 0)$ first and then differentiate w.r.t. $t$. This is equal to the integral of $p(x,y)$ over the region satisfying $x \leq t, x+y > 0,$ which is an infinite wedge. We could then carefully set up the bounds on the double integral to compute this, but there should be an easier approach.
Another way, similar to the first: let $Z = X+Y$; it is tedious but not hard to compute $p(x,z).$ Again, we first compute $\mathbb{P}(X \leq t | Z > 0)$ and then differentiate w.r.t. $t$ to get the result. This is equal to the integral of $p(x,z)$ over $z > 0, x \leq t,$ which is now an easier double integral.
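As a numerical sanity check on this second approach (a sketch assuming NumPy/SciPy; the bounds $\pm 10$ truncate the infinite integration region): the pair $(X,Z)$ is bivariate normal with mean zero and covariance $\begin{pmatrix}1 & 1\\ 1 & 2\end{pmatrix}$, so $\mathbb{P}(X\le t \mid Z>0)$ can be computed by quadrature and differentiated with a central difference.

```python
import numpy as np
from scipy.integrate import dblquad
from scipy.stats import multivariate_normal

# Joint of (X, Z) with Z = X + Y: zero mean, Cov(X, Z) = Var(X) = 1, Var(Z) = 2.
joint = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 1.0], [1.0, 2.0]])

def cond_cdf(t):
    # P(X <= t | Z > 0) = P(X <= t, Z > 0) / P(Z > 0), with P(Z > 0) = 1/2.
    # dblquad integrates the inner variable (z) from 0 to 10, outer (x) from -10 to t;
    # +/-10 stands in for +/-infinity, which is harmless at this scale.
    num, _ = dblquad(lambda z, x: joint.pdf([x, z]), -10.0, t, 0.0, 10.0)
    return num / 0.5

# Differentiate the conditional CDF numerically to read off the density at t = 0.7.
h = 1e-4
t = 0.7
density = (cond_cdf(t + h) - cond_cdf(t - h)) / (2 * h)
print(density)
```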
Does anyone have a slick approach?
I'd suggest applying Bayes' rule in the form $$f_{X|X+Y>0}(x)=\frac{P(X+Y>0|X=x)f_X(x)}{P(X+Y>0)}$$ and using the independence of $X$ and $Y$.
The independence is used as follows: $$P(X+Y>0|X=x)=P(x+Y>0)$$
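Carrying this through: since $Y\sim N(0,1)$, $P(x+Y>0)=P(Y>-x)=\Phi(x)$, and by symmetry $P(X+Y>0)=\tfrac12$, so $f_{X|X+Y>0}(x) = 2\,\varphi(x)\,\Phi(x)$ (a skew-normal density). A quick Monte Carlo sketch (assuming NumPy/SciPy) compares this closed form to an empirical histogram:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Sample (X, Y) i.i.d. N(0,1) and keep the X's on the event X + Y > 0.
n = 1_000_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)
x_cond = x[x + y > 0]

# Closed form from Bayes' rule: f(t) = phi(t) * Phi(t) / (1/2) = 2 phi(t) Phi(t).
def f(t):
    return 2 * norm.pdf(t) * norm.cdf(t)

# Compare an empirical histogram (as a density) to the closed form on [-3, 3].
edges = np.linspace(-3, 3, 61)
hist, _ = np.histogram(x_cond, bins=edges, density=True)
centers = (edges[:-1] + edges[1:]) / 2
max_err = np.max(np.abs(hist - f(centers)))
print(max_err)  # should be small, on the order of 1e-2 or less
```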