$X$ and $Y$ are independent random variables with distribution Exp(1).
Let us define $U=e^X$ and $V=X+Y$. Calculate $E(U\mid V=1)$.
I tried going by the definition and finding the densities, but I always get a non-converging integral... Thanks in advance.
Edit: the answer by the book is $e-1$.
$$ \begin{align} \mathbb{E}[U|V](v) &= \mathbb{E}[e^X|V](v)\\ &= \int_0^\infty e^x f_{X|V=v}(x)\,dx\\ &= \int_0^\infty e^x \frac{f_{V|X=x}(v) f_X(x)}{f_V(v)}\,dx \end{align} $$
Note that $V\sim \text{Gamma}(2,1)$, so $f_V(v) = ve^{-v}$. Also (I'll let you derive this part) $$ P(V\le v \mid X=x) = (1-e^{-(v-x)})\boldsymbol{1}(v\ge x). $$ Therefore, by taking the derivative with respect to $v$, $$ f_{V|X=x}(v) = e^{-(v-x)}\boldsymbol{1}(v\ge x). $$ Substituting into the conditional expectation expression, the exponentials cancel: $\frac{e^{-(v-x)}e^{-x}}{ve^{-v}} = \frac{1}{v}$, so $$ \begin{align} \mathbb{E}[U|V](v) &= \int_0^\infty e^x \frac{1}{v} \boldsymbol{1}(v\ge x)\,dx\\ &= \frac{e^v-1}{v}. \end{align} $$
Therefore $\mathbb{E}[U|V=1] = \frac{e^1-1}{1} = e-1$, which matches your answer.
Technically the above only holds for nonnegative $v$, but we don't really need to worry about that here; just wanted to mention it.
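If you want a quick sanity check of the $e-1$ result, here is a small Monte Carlo sketch in Python (not part of the derivation; the window width `0.01` is an arbitrary choice for approximating the conditioning event $V=1$):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000

# X, Y independent Exp(1)
x = rng.exponential(1.0, n)
y = rng.exponential(1.0, n)
v = x + y

# Approximate conditioning on V = 1 by keeping samples with V in a narrow window
mask = np.abs(v - 1.0) < 0.01
est = np.exp(x[mask]).mean()  # Monte Carlo estimate of E[e^X | V = 1]

print(est)         # should be close to e - 1 ≈ 1.718
print(np.e - 1.0)
```

Shrinking the window (and increasing `n`) tightens the approximation to the exact conditional expectation.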