Let $X_1, X_2$ be i.i.d. random variables with densities
\begin{align*} f_{X_1}(x_1) &= e^{-x_1}, \quad x_1 > 0 \\ f_{X_2}(x_2) &= e^{-x_2}, \quad x_2 > 0 \end{align*}
Derive the conditional density of $X_1$ given $X_1 + X_2 = y$.
Here's my attempt at a solution:
\begin{align*} f_{(X_1 | X_1 + X_2)}(x|y) &= \frac{f_{(X_1 , X_1 + X_2)}(x,y)}{f_{X_1+X_2}(y)} \\ &= \frac{f_{X_1}(x) f_{X_2}(y-x)}{f_{X_1+X_2}(y)} \\ &= \frac{f_{X_1}(x) f_{X_2}(y-x)}{\int_0^y f_{X_1}(x) f_{X_2}(y-x) \, dx} \\ &= \frac{e^{-x} e^{-y+x}}{\int_0^y e^{-x} e^{-y+x} \, dx} \\ &= \frac{1}{\int_0^y dx} \\ &= \frac{1}{y}, \quad y>0 \end{align*}
Is this correct?
HINT
It can't be what you say at the end, since $\frac1y$ without any restriction on $x$ is not normalized as a density in $x$. Your work is mostly right, but you missed a restriction on $x$: the exponential densities vanish for negative arguments, so the numerator $f_{X_1}(x) f_{X_2}(y-x)$ is nonzero only when $0 < x < y$.
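The conclusion (uniform on $(0,y)$) can be sanity-checked by simulation: draw i.i.d. $\mathrm{Exp}(1)$ pairs, keep $X_1$ whenever the sum lands in a small window around $y$, and compare the sample mean and variance to the $\mathrm{Uniform}(0,y)$ predictions $y/2$ and $y^2/12$. A minimal sketch (the window width `eps` and the sample size are arbitrary choices, not from the question):

```python
import random

random.seed(0)
y, eps, n = 2.0, 0.01, 2_000_000

# Rejection sampling: draw i.i.d. Exp(1) pairs and keep X1 whenever
# the sum X1 + X2 falls within eps of the target value y.
kept = []
for _ in range(n):
    x1 = random.expovariate(1.0)
    x2 = random.expovariate(1.0)
    if abs(x1 + x2 - y) < eps:
        kept.append(x1)

mean = sum(kept) / len(kept)
var = sum((v - mean) ** 2 for v in kept) / len(kept)
# Uniform(0, y) predicts mean y/2 = 1 and variance y^2/12 ~ 0.333.
print(mean, var)
```

With the restriction $0 < x < y$ in place, the sample statistics match the uniform predictions closely.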