Suppose that $X$ and $Y$ are independent $\chi^2(k)$ random variables and $S=X+Y$ is their sum. I'm having trouble deriving the conditional probability $$ P(S>s\mid X\leq c, Y\leq c) $$ where $c$ is some fixed value. By the definition of conditional probability, this reduces to finding the joint probability $$ P(S> s, X\leq c, Y\leq c). $$ The route I was taking: \begin{align} P(S> s, X\leq c, Y\leq c)&=\int_0^c P(X\leq c, S> s\mid Y=y)f_Y(y)dy\\ &=\int_0^c P(X\leq c, X+y> s\mid Y=y)f_Y(y)dy\\ &=\int_0^c P(X\leq c, X> s-y\mid Y=y)f_Y(y)dy\\ &=\int_0^c P(s-y<X\leq c\mid Y=y)f_Y(y)dy\\ &=\int_0^c P(s-y<X\leq c)f_Y(y)dy\hspace{5mm}\text{by independence of $X,Y$}\\ &=\int_0^c\left(\int_{s-y}^c f_X(x) dx\right)f_Y(y)dy \end{align}
Is this the right track?
The integrand $f_X(x)\, f_Y(y)\, dx \, dy$ looks correct, but you should treat the limits of integration separately in the two cases $0 \le s \le c$ and $c \le s \le 2c$, and the double integral is unlikely to simplify nicely. Meanwhile, $s \lt 0$ and $s \gt 2c$ give conditional probabilities of $1$ and $0$, respectively.
The case $0 \le s \le c$ gives $$P(S>s\mid X\leq c, Y\leq c)=1-\dfrac{\int_0^s \int_0^{s-y} f_X(x)\,f_Y(y)\,dx\,dy}{\int_0^c\int_0^c f_X(x)\,f_Y(y)\,dx\,dy},$$
while $c \le s \le 2c$ gives $$P(S>s\mid X\leq c, Y\leq c)=\dfrac{\int_{s-c}^c \int_{s-y}^c f_X(x)\,f_Y(y)\,dx\,dy}{\int_0^c\int_0^c f_X(x)\,f_Y(y)\,dx\,dy}.$$
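As a sanity check, the piecewise formula can be compared against a simulation. Below is a sketch using SciPy's `chi2` distribution and `dblquad` for the double integrals; the particular values of $k$, $c$, and $s$ are arbitrary choices, not part of the question.

```python
# Numerical check of P(S > s | X <= c, Y <= c) for independent chi^2(k) X, Y.
import numpy as np
from scipy import stats, integrate

k, c = 3.0, 4.0                    # arbitrary illustrative values
f = stats.chi2(df=k).pdf           # common density of X and Y

def cond_prob(s):
    """Piecewise formula for P(S > s | X <= c, Y <= c)."""
    denom = stats.chi2(df=k).cdf(c) ** 2          # P(X <= c) P(Y <= c)
    if s <= 0:
        return 1.0
    if s >= 2 * c:
        return 0.0
    if s <= c:
        # numerator is P(S <= s): integrate f_X(x) f_Y(y) over x + y <= s
        num, _ = integrate.dblquad(lambda x, y: f(x) * f(y),
                                   0, s, 0, lambda y: s - y)
        return 1 - num / denom
    # c <= s <= 2c: integrate over s - y < x <= c, s - c <= y <= c
    num, _ = integrate.dblquad(lambda x, y: f(x) * f(y),
                               s - c, c, lambda y: s - y, c)
    return num / denom

# Monte Carlo comparison: condition on X <= c and Y <= c by rejection
rng = np.random.default_rng(0)
x = rng.chisquare(k, 2_000_000)
y = rng.chisquare(k, 2_000_000)
keep = (x <= c) & (y <= c)
for s in (2.0, 6.0):               # one value in each case
    mc = np.mean(x[keep] + y[keep] > s)
    print(f"s={s}: formula={cond_prob(s):.4f}  simulation={mc:.4f}")
```

The two columns should agree to a few decimal places, which confirms both the case split at $s=c$ and the limits of integration.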