Let $X$ be a random variable with values $0$ and $1$.
Let $Y$ be a random variable with values in $\mathbb{N}_0$.
Let $p \in (0,1)$, and suppose that for all $n \in \mathbb{N}_0$, $P(X=0, Y=n) = p \cdot \frac{e^{-1}}{n!}$ and $P(X=1, Y=n) = (1-p) \cdot \frac{2^{n}e^{-2}}{n!}$.
a) Calculate the marginal distributions $p_X$ and $p_Y$.
b) Calculate $P(X=0 \mid Y > 0)$.
c) Calculate $E(Y)$ and $Var(Y)$ (expected value and variance).
I think that we have to start with $p_X(x) = \sum_y P(X=x, Y=y) = \sum_y P(X=x \mid Y=y) \cdot P(Y=y)$, since this is the formula for the marginal distribution? Edit: a) and b) are clear now.
For c): We want to use the law of total expectation. \begin{align*} E(Y) &= E(E(Y \mid X)) = \sum_x E(Y \mid X=x) \cdot P(X=x) \\ &= E(Y \mid X=0) \cdot p + E(Y \mid X=1) \cdot (1-p) \\ &= \sum_{n=1}^{\infty} n \cdot P(Y=n \mid X=0) \cdot p + \sum_{n=1}^{\infty} n \cdot P(Y=n \mid X=1) \cdot (1-p) \\ &= p \sum_{n=1}^{\infty} n \cdot \frac{e^{-1}}{n!} + (1-p) \sum_{n=1}^{\infty} n \cdot \frac{e^{-2} \cdot 2^n}{n!} \\ &= p \cdot e^{-1} \cdot e + 2(1-p) \cdot e^{-2} \sum_{n=1}^{\infty} \frac{2^{n-1}}{(n-1)!} \\ &= p + 2(1-p) \cdot e^{-2} \cdot e^{2} = p + 2(1-p) = 2 - p. \end{align*}
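As a quick numerical sanity check of $E(Y) = 2 - p$ (just a sketch; $p = 0.3$ is an arbitrary choice for illustration), one can sum the mixture pmf directly:

```python
import math

# Illustrative choice of p; any value in (0, 1) works the same way.
p = 0.3

def pmf_Y(n):
    # P(Y = n) = p * Poisson(1) pmf + (1 - p) * Poisson(2) pmf
    return (p * math.exp(-1) / math.factorial(n)
            + (1 - p) * 2**n * math.exp(-2) / math.factorial(n))

# Truncate the series at n = 100; the Poisson tail beyond that is negligible.
E_Y = sum(n * pmf_Y(n) for n in range(101))
print(E_Y)  # close to 2 - p = 1.7
```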
First, you can compute $P(X=0)$ using the law of total probability: \begin{align*} P(X=0) &= \sum_{n} P(X=0, Y=n) \\ &= p \sum_{n} \frac{\exp(-1)}{n!} = p. \end{align*} Next, note that \begin{align*} P(Y=n \mid X=0) &= \frac{P(X=0, Y=n)}{P(X=0)} \\ &= \frac{\exp(-1)}{n!}. \end{align*} That is, $Y \mid X=0 \sim \text{Poisson}(1)$. Also, \begin{align*} P(Y=n \mid X=1) &= \frac{P(X=1, Y=n)}{P(X=1)} \\ &= \frac{\exp(-2) 2^n}{n!}. \end{align*} That is, $Y \mid X=1 \sim \text{Poisson}(2)$.
a) $X \sim \text{Bernoulli}(1-p)$, i.e. $P(X=0) = p$ and $P(X=1) = 1-p$, and $Y$ is a Poisson mixture: $Y \sim p \cdot \text{Poisson}(1) + (1-p) \cdot \text{Poisson}(2)$.
b) Your equality is incorrect. Instead, you can use Bayes' Theorem: \begin{align*} P(X=0|Y>0) &=\frac{P(X=0)P(Y>0|X=0)}{P(X=0)P(Y>0|X=0)+P(X=1)P(Y>0|X=1)} \end{align*}
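Filling in the pieces the Bayes formula needs (a sketch, using the conditional Poisson distributions from above): since $Y \mid X=0 \sim \text{Poisson}(1)$ and $Y \mid X=1 \sim \text{Poisson}(2)$,
$$P(Y>0 \mid X=0) = 1 - P(Y=0 \mid X=0) = 1 - e^{-1}, \qquad P(Y>0 \mid X=1) = 1 - e^{-2},$$
so
$$P(X=0 \mid Y>0) = \frac{p\,(1-e^{-1})}{p\,(1-e^{-1}) + (1-p)(1-e^{-2})}.$$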
c) You can use the conditional distribution of $Y$ to compute $E[Y]$ and $Var[Y]$, that is: \begin{align*} E[Y] &= E[E[Y|X]] & \text{Law of total expectation} \\ Var[Y] &= Var[E[Y|X]] + E[Var[Y|X]] & \text{Law of total variance} \end{align*}
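To sketch the variance step: a $\text{Poisson}(\lambda)$ variable has mean and variance both $\lambda$, so $E[Y|X]$ takes the value $1$ with probability $p$ and $2$ with probability $1-p$, and likewise for $Var[Y|X]$. Plugging in:
\begin{align*} E[Var[Y|X]] &= p \cdot 1 + (1-p) \cdot 2 = 2 - p, \\ Var[E[Y|X]] &= p(1-p)(2-1)^2 = p(1-p), \\ Var[Y] &= (2-p) + p(1-p). \end{align*}
Here $Var[E[Y|X]]$ is just the variance of a two-point distribution on $\{1, 2\}$ with weights $p$ and $1-p$.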