I'm working on a problem from Casella and Berger's Statistical Inference. X is distributed as Poisson$(\theta)$ and Y is distributed as Poisson$(\lambda)$, with X and Y being independent. We let U = X + Y and V = Y, and the conditional pdf of V|U is:
$\hspace{15mm}\large f(v\mid u) = \large \frac{f(u,v)}{f(u)} = \huge \frac{\frac{\theta^{u-v}e^{-\theta}}{(u-v)!}\cdot\frac{\lambda^ve^{-\lambda}}{v!}}{\frac{(\theta+\lambda)^ue^{-(\theta+\lambda)}}{u!}}$
Apparently this simplifies to
$\hspace{15mm}\Large {u \choose v} (\frac{\lambda}{\theta+\lambda})^v(\frac{\theta}{\theta+\lambda})^{u-v}$
I don't see how the $\Large(\frac{\lambda}{\theta+\lambda})^v$ is possible. I keep ending up with $\Large(\frac{\lambda}{\theta})^v$ when I simplify the problem.
The key ingredient here is the independence of $X$ and $Y$: it is what lets the joint probability in the numerator factor.
Assuming that $X$ and $Y$ are independent Poisson random variables, $U = X+Y$ is also a Poisson random variable, with parameter $\theta+\lambda$. Thus, for a nonnegative integer $m$, $$p_{X+Y}(m) = \frac{e^{-(\theta+\lambda)}(\theta+\lambda)^m}{m!}.$$ Conditioned on $X+Y = m$, the conditional probability that $Y = n$ is $0$ if $n > m$, while for $0 \leq n \leq m$, $$\begin{align}p_{Y\mid X+Y=m}(n\mid m) &= \frac{P\left(\{Y=n\} \cap \{X+Y=m\}\right)}{P\{X+Y=m\}} &{\scriptstyle{\text{definition of conditional probability}}}\\ &= \frac{P\{X=m-n,\,Y=n\}}{P\{X+Y=m\}} &{\scriptstyle{\text{a re-write}}}\\ &= \frac{P\{X=m-n\}\,P\{Y=n\}}{\frac{e^{-(\theta+\lambda)}(\theta+\lambda)^m}{m!}} &{\scriptstyle{\text{independence of}~X~\text{and}~Y}}\\ &= \frac{\frac{e^{-\theta}\theta^{m-n}}{(m-n)!}\, \frac{e^{-\lambda}\lambda^{n}}{n!}}{\frac{e^{-(\theta+\lambda)}(\theta+\lambda)^m}{m!}}\\ &= \frac{m!}{(m-n)!\,n!}\cdot\frac{\theta^{m-n}\lambda^{n}}{(\theta+\lambda)^m} &{\scriptstyle e^{-\theta}e^{-\lambda}=e^{-(\theta+\lambda)}~\text{cancels}}\\ &= \binom{m}{n}\left(\frac{\theta}{\theta+\lambda}\right)^{m-n} \left(\frac{\lambda}{\theta+\lambda}\right)^{n}, \end{align}$$ which shows that, conditioned on the value of $X+Y$, $Y$ is a binomial random variable with parameters $m$ and $\lambda/(\theta+\lambda)$.
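As a quick numerical sanity check (not part of the proof), the identity can be verified for specific parameter values; the choices $\theta = 2$, $\lambda = 3$, $m = 7$ below are arbitrary:

```python
import math

def poisson_pmf(k, mu):
    """Poisson pmf: P(N = k) when N ~ Poisson(mu)."""
    return math.exp(-mu) * mu**k / math.factorial(k)

def conditional_pmf(n, m, theta, lam):
    """P(Y = n | X + Y = m), computed straight from the Poisson pmfs."""
    return poisson_pmf(m - n, theta) * poisson_pmf(n, lam) / poisson_pmf(m, theta + lam)

def binomial_pmf(n, m, p):
    """Binomial(m, p) pmf evaluated at n."""
    return math.comb(m, n) * p**n * (1 - p)**(m - n)

theta, lam, m = 2.0, 3.0, 7  # arbitrary test values
for n in range(m + 1):
    # The conditional pmf should match Binomial(m, lam / (theta + lam)) at every n.
    assert math.isclose(conditional_pmf(n, m, theta, lam),
                        binomial_pmf(n, m, lam / (theta + lam)))
```

The loop passes for every $n$ from $0$ to $m$, matching the algebraic result above.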