My answer: $P(Y=\ell \mid X=k) = \dfrac{P(Y=\ell \text{ and } X=k)}{P(X=k)} = \dfrac{\binom{k}{\ell}p^\ell(1-p)^{k-\ell}}{e^{-\lambda}\,\lambda^k/k!}$
Joint mass function: $P(X=k,\ Y=\ell) = \binom{k}{\ell}p^\ell(1-p)^{k-\ell}$
Support: if $X=k$ and $Y=\ell$, then we need $k \ge \ell$, i.e. $0 \le \ell \le k$.
I am not sure how to use the distribution of $X$ when it is Poisson; intuitively I don't understand it. I interpret $P(X=k)$ as the probability that there will be $k$ tosses, but what does this mean? Am I understanding it wrong? I have also written answers to the first three parts of a). I'm not sure whether the conditional probability is the first or the second case (intuitively, the second one, the one in the box, seems more logical to me). The PMF in my answer also seems intuitive, though again I'm unsure; the support should be right. Could you please explain how to do a) and b)?


You seem to be trying to find the conditional distribution of $Y$ given $X$ by using the definition of conditional probability. But you are given the conditional distribution of $Y$ given $X.$ It is this: $$ Y\mid (X=k) \sim \operatorname{Binomial}(k,p). $$ So then you get the joint probability mass function: \begin{align} \Pr(Y=\ell\ \&\ X=k) = {} & \Pr(X=k)\Pr(Y=\ell\mid X = k) \\[8pt] = {} & \frac{\lambda^k e^{-\lambda}}{k!}\cdot \binom k \ell p^\ell(1-p)^{k-\ell} \\[8pt] & \underbrace{\text{ for } 0 \le\ell \le k.}_\text{This is the support.} \end{align}
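As a quick numerical sanity check of the joint pmf above (a sketch in plain Python; the values $\lambda = 3$ and $p = 0.4$ are arbitrary choices, not from the problem), summing it over the support $0 \le \ell \le k$ should give $1$:

```python
import math

lam, p = 3.0, 0.4  # arbitrary example parameters

def poisson_pmf(k, rate):
    # P(X = k) for X ~ Poisson(rate)
    return math.exp(-rate) * rate**k / math.factorial(k)

def joint_pmf(k, l, lam, p):
    # P(X = k, Y = l) = P(X = k) * P(Y = l | X = k), valid for 0 <= l <= k
    if not 0 <= l <= k:
        return 0.0
    return poisson_pmf(k, lam) * math.comb(k, l) * p**l * (1 - p)**(k - l)

# Sum over the support, truncating k at 60 (the Poisson(3) tail beyond
# that is negligible); the total should be ~1.
total = sum(joint_pmf(k, l, lam, p) for k in range(60) for l in range(k + 1))
```

The truncation at $k = 60$ stands in for the infinite sum; any cutoff far into the Poisson tail works.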
The marginal distribution of $Y$ is the part that has appeared in similar questions here before. You have \begin{align} & \Pr(Y=\ell) \\[8pt] = {} & \Pr(Y=\ell\ \&\ X=\ell) \\ & {} + \Pr(Y=\ell\ \&\ X= \ell+1) \\ & {} + \Pr(Y=\ell\ \&\ X=\ell+2) \\ & {} + \Pr(Y=\ell\ \&\ X= \ell+3) + \cdots \\[8pt] = {} & \sum_{k=\ell}^\infty \Pr(Y=\ell\ \&\ X = k) \\[8pt] = {} & \sum_{k=\ell}^\infty \frac{\lambda^k e^{-\lambda}}{k!} \cdot \binom k \ell p^\ell (1-p)^{k-\ell} \\[8pt] = {} & \sum_{k=\ell}^\infty \frac{\lambda^k e^{-\lambda}}{k!} \cdot \frac{k!}{\ell!(k-\ell)!} p^\ell (1-p)^{k-\ell} \\[8pt] & \text{and $k!$ cancels from the numerator and denominator:} \\[8pt] = {} & \sum_{k=\ell}^\infty \frac{\lambda^k e^{-\lambda}}{\ell!(k-\ell)!} \cdot p^\ell (1-p)^{k-\ell}. \end{align} Now observe that as $k$ goes from $\ell$ to $\infty$ the factor $e^{-\lambda}/\ell!$ does not change, so we can pull it out: $$ \frac{e^{-\lambda}}{\ell!} \sum_{k=\ell}^\infty \frac{\lambda^k}{(k-\ell)!} \cdot p^\ell(1-p)^{k-\ell}. $$ As $k$ goes from $\ell$ to $\infty,$ the expression $k-\ell$ appearing in the sum goes from $0$ to $\infty.$ Thus we can let $j=k-\ell$ so that $\displaystyle \sum_{k=\ell}^\infty$ is replaced by $\displaystyle \sum_{j=0}^\infty:$ $$ \frac{e^{-\lambda}}{\ell!} \sum_{j=0}^\infty \frac{\lambda^{j+\ell}}{j!} \cdot p^\ell(1-p)^j. $$ Now as $j$ goes from $0$ to $\infty,$ $\ell$ does not change. Thus the expression above becomes \begin{align} & \frac{e^{-\lambda}}{\ell!}\cdot\lambda^\ell p^\ell \sum_{j=0}^\infty \frac{\lambda^j}{j!} \cdot (1-p)^j \\[12pt] = {} & \frac{e^{-\lambda} (\lambda p)^\ell}{\ell!} \cdot \sum_{j=0}^\infty \frac{(\lambda(1-p))^j}{j!} \\[8pt] = {} & \frac{e^{-\lambda} (\lambda p)^\ell}{\ell!} \cdot e^{\lambda(1-p)} \\[10pt] = {} & \frac{e^{-\lambda p} (\lambda p)^\ell}{\ell!}. \end{align} Therefore $Y\sim\operatorname{Poisson}(\lambda p).$
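The marginalization above can be checked numerically as well (again a sketch with arbitrary $\lambda = 3$, $p = 0.4$): summing the joint pmf over $k \ge \ell$ should reproduce the $\operatorname{Poisson}(\lambda p)$ pmf at every $\ell$.

```python
import math

lam, p = 3.0, 0.4  # arbitrary example parameters

def poisson_pmf(k, rate):
    # P(X = k) for X ~ Poisson(rate)
    return math.exp(-rate) * rate**k / math.factorial(k)

def marginal_Y(l, lam, p, kmax=100):
    # P(Y = l) = sum over k >= l of P(X = k) * Binomial(k, p) pmf at l,
    # truncated at kmax (the Poisson(3) tail beyond 100 is negligible)
    return sum(
        poisson_pmf(k, lam) * math.comb(k, l) * p**l * (1 - p)**(k - l)
        for k in range(l, kmax)
    )

# The derivation says this sum equals the Poisson(lam * p) pmf at l.
errs = [abs(marginal_Y(l, lam, p) - poisson_pmf(l, lam * p)) for l in range(10)]
```

The discrepancies are at the level of floating-point rounding, consistent with $Y\sim\operatorname{Poisson}(\lambda p)$.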
The distribution of $Z=X-Y$ is found the same way: Just notice that $Z\mid (X=k)\sim\operatorname{Binomial}(k,1-p)$ and go on from there.
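Carrying that out amounts to the same marginalization with $p$ replaced by $1-p$, so one expects $Z\sim\operatorname{Poisson}(\lambda(1-p))$. A numerical sketch of that check (same arbitrary parameters as before):

```python
import math

lam, p = 3.0, 0.4  # arbitrary example parameters

def poisson_pmf(k, rate):
    # P(X = k) for X ~ Poisson(rate)
    return math.exp(-rate) * rate**k / math.factorial(k)

def marginal_Z(m, lam, p, kmax=100):
    # Z | (X = k) ~ Binomial(k, 1 - p), so
    # P(Z = m) = sum over k >= m of P(X = k) * Binomial(k, 1 - p) pmf at m
    q = 1 - p
    return sum(
        poisson_pmf(k, lam) * math.comb(k, m) * q**m * (1 - q)**(k - m)
        for k in range(m, kmax)
    )

# Repeating the earlier argument with p -> 1 - p predicts Z ~ Poisson(lam*(1-p)).
errs = [abs(marginal_Z(m, lam, p) - poisson_pmf(m, lam * (1 - p))) for m in range(10)]
```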
After that, the only step left in finding the joint distribution of $Z$ and $Y$ is showing that $Z$ and $Y$ are actually independent.
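Independence means the joint pmf of $(Y,Z)$ factors into the product of the two Poisson marginals. Since $Y=\ell$ and $Z=m$ together force $X=\ell+m$, the joint pmf is $\Pr(X=\ell+m)\binom{\ell+m}{\ell}p^\ell(1-p)^m$, and a numerical sketch (arbitrary $\lambda = 3$, $p = 0.4$) confirms the factorization:

```python
import math

lam, p = 3.0, 0.4  # arbitrary example parameters

def poisson_pmf(k, rate):
    # P(X = k) for X ~ Poisson(rate)
    return math.exp(-rate) * rate**k / math.factorial(k)

def joint_YZ(l, m, lam, p):
    # Y = l and Z = m force X = l + m, so
    # P(Y = l, Z = m) = P(X = l + m) * Binomial(l + m, p) pmf at l
    return poisson_pmf(l + m, lam) * math.comb(l + m, l) * p**l * (1 - p)**m

# Independence <=> joint = Poisson(lam*p) pmf at l times Poisson(lam*(1-p)) pmf at m
errs = [
    abs(joint_YZ(l, m, lam, p)
        - poisson_pmf(l, lam * p) * poisson_pmf(m, lam * (1 - p)))
    for l in range(8) for m in range(8)
]
```

Here the agreement is exact up to floating-point rounding, because the algebraic identity $e^{-\lambda}\frac{(\lambda p)^\ell(\lambda(1-p))^m}{\ell!\,m!} = \frac{e^{-\lambda p}(\lambda p)^\ell}{\ell!}\cdot\frac{e^{-\lambda(1-p)}(\lambda(1-p))^m}{m!}$ holds term by term.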