I applied Bayes' rule as follows:
$$ P\left(\sum_{i=1}^N X_i>y|x<\sum_{i=1}^{N-1} X_i<y\right)=\frac{P\left(x<\sum_{i=1}^{N-1} X_i<y|\sum_{i=1}^N X_i>y\right)P\left(\sum_{i=1}^N X_i>y\right)}{P\left(x<\sum_{i=1}^{N-1} X_i<y\right)} $$
Here the $X_i$ are random variables, and $x<0$ and $y>0$ are real numbers. My intention was to make the calculation of the left hand side easier, because in its current form I need to convolve the density of $\sum_{i=1}^{N-1} X_i$ with that of $X_N$ over the range $[x,y]$.
Two of the terms on the right hand side, namely $P(\sum_{i=1}^N X_i>y)$ and $P(x<\sum_{i=1}^{N-1} X_i<y)$, are easy to handle because the convolution is the regular one, with limits $(-\infty,\infty)$.
I was thinking that the third term, $P(x<\sum_{i=1}^{N-1} X_i<y|\sum_{i=1}^N X_i>y)$, would be equal to $1$: if the sum of $N$ terms is known to be larger than $y$, then the sum of the first $N-1$ terms has to lie in $[x,y]$, because otherwise the process would have terminated at the $(N-1)$th stage. I was wrong, because the formulation has nothing to do with the procedure of terminating the process.
My question: how can we calculate $P(x<\sum_{i=1}^{N-1} X_i<y|\sum_{i=1}^N X_i>y)$ in a simple way? Is the right hand side (after applying Bayes to the left side) easier to calculate after all?
Thanks!
So... one considers two independent random variables $U$ and $V$ and the idea to compute $$(\ast)=P[U+V\gt y\mid x\lt U\lt y] $$ would be to see whether $P[x\lt U\lt y\mid U+V\gt y]$ would not be simpler. I seriously doubt it is.
To compute $(\ast)$ when $U$ has a density, consider the PDF $f_U$ of $U$, the CDF $F_U$ of $U$ and the complementary CDF $\bar F_V$ of $V$, then $$ (\ast)=\frac{\displaystyle\int_x^y\bar F_V(y-u)f_U(u)\mathrm du}{\displaystyle\int_x^yf_U(u)\mathrm du}=\frac1{F_U(y)-F_U(x)}\int_x^y\bar F_V(y-u)f_U(u)\mathrm du. $$
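As a quick sanity check, this one-dimensional integral can be compared against a direct Monte Carlo estimate of $P[U+V>y\mid x<U<y]$. The sketch below assumes, purely for illustration, that $U$ and $V$ are independent standard normals and picks arbitrary values $x=-0.5$, $y=1.0$ (nothing in the original fixes these choices):

```python
import numpy as np
from math import erf, sqrt

def Phi(t):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

# Illustrative choices (not from the original post): U, V ~ N(0,1) i.i.d.
x, y = -0.5, 1.0

# Quadrature of (*) = (1/(F_U(y)-F_U(x))) * \int_x^y Fbar_V(y-u) f_U(u) du
u = np.linspace(x, y, 20001)
f_U = np.exp(-u**2 / 2.0) / np.sqrt(2.0 * np.pi)       # PDF of U
Fbar_V = 1.0 - np.array([Phi(y - t) for t in u])        # complementary CDF of V
g = Fbar_V * f_U
integral = np.sum((g[:-1] + g[1:]) / 2.0) * (u[1] - u[0])  # trapezoid rule
star = integral / (Phi(y) - Phi(x))

# Monte Carlo estimate of P[U + V > y | x < U < y] for comparison.
rng = np.random.default_rng(0)
U = rng.standard_normal(1_000_000)
V = rng.standard_normal(1_000_000)
mask = (x < U) & (U < y)
mc = (U[mask] + V[mask] > y).mean()

print(star, mc)  # the two estimates should agree to about three decimals
```

The agreement of the two numbers confirms that the single integral above already does the job, without any need for the Bayes rearrangement.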