A Zero-Inflated Poisson r.v. $X$ with parameter $p$ and $\lambda$ can be generated as follows. First flip a coin with probability of $p$ of Heads. Given that the coin lands Heads, $X = 0$. Given that the coin lands Tails, $X$ is distributed Pois($\lambda$). Note that if $X = 0$ occurs, there are two possible explanations: the coin could have landed Heads (in which case the zero is called a structural zero), or the coin could have landed Tails but the Poisson r.v. turned out to be zero anyway. For example, if $X$ is the number of chicken sandwiches consumed by a random person in a week, then $X = 0$ for vegetarians (this is a structural zero), but a chicken-eater could still have $X = 0$ by chance (since they might not happen to eat any chicken sandwiches that week).
a) Find the PMF of a zero-inflated Poisson r.v. $X$.
b) Find the mean and variance of $X$.
Hint: Use the following facts from calculus: $$\sum_{k=0}^{\infty}\frac{\lambda^k}{k!}=e^\lambda\,,$$ $$\sum_{k=0}^{\infty}\frac{(k+1)\lambda^k}{k!}=e^\lambda+\lambda e^\lambda.$$
For part A I got $$P(X = 0) = p + (1 - p)\frac{\lambda^0}{0!}e^{-\lambda} = p + (1 - p)e^{-\lambda},$$ and for $k \ge 1$, $$P(X = k) = k(1 - p)\frac{\lambda^k}{k!}e^{-\lambda}.$$
For part B, writing $X = (1 - I)Y$ where $I$ is the indicator of Heads and $Y \sim \text{Pois}(\lambda)$, I got $$E(X^2) = E((1 - I)^2Y^2) = E((1 - I)^2)E(Y^2) = (1 - p)(\lambda + \lambda^2),$$ using $$E(Y^2) = Var(Y) + E(Y)^2 = \lambda + \lambda^2,$$ and finally $$Var(X) = E(X^2) - E(X)^2 = (1 - p)(\lambda + \lambda^2) - (1 - p)^2\lambda^2.$$
My answers seem off to me but I am not exactly sure where I am off. Any help would be much appreciated.
The answer you obtained for Part A is almost correct; you have an extra factor of $k$ when $X \ne 0$. To see why, another way to write the zero-inflated Poisson model is to let $X = (1-B) N$, where $B \sim \operatorname{Bernoulli}(p)$, and $N \sim \operatorname{Poisson}(\lambda)$. Then $$\Pr[X = x] = \Pr[X = x \mid B = 0]\Pr[B = 0] + \Pr[X = x \mid B = 1]\Pr[B = 1]$$ by the law of total probability. This simplifies to $$\Pr[X = x] = \Pr[N = x](1-p) + \Pr[0 = x]p.$$ Then we have $$\Pr[X = x] = e^{-\lambda} \frac{\lambda^x}{x!} (1-p) + \mathbb 1(x=0)p$$ where $$\mathbb 1(x=0) = \begin{cases}1, & x = 0 \\ 0, & x \ne 0. \end{cases}$$ Writing this out explicitly, $$\Pr[X = x] = \begin{cases} e^{-\lambda} (1-p) + p, & x = 0 \\ e^{-\lambda}\frac{\lambda^x}{x!} (1-p), & x \ne 0. \end{cases}$$ When you add up the probability of all possible values of $X$, you should get $1$: $$\begin{align} \sum_{x=0}^\infty \Pr[X = x] &= \Pr[X = 0] + \sum_{x=1}^\infty \Pr[X = x] \\ &= p + (1-p) \left( e^{-\lambda} + \sum_{x=1}^\infty e^{-\lambda} \frac{\lambda^x}{x!} \right) \\ &= p + (1-p)(1) \\ &= 1. \end{align}$$
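As a sanity check, here is a small simulation sketch of the $X = (1-B)N$ construction, verifying that the empirical $\Pr[X=0]$ matches $p + (1-p)e^{-\lambda}$ and that the PMF sums to $1$. The parameter values `p = 0.3, lam = 2.0` are illustrative choices, not from the problem:

```python
import math
import random

# Illustrative parameters (not specified in the original problem).
p, lam = 0.3, 2.0

def zip_pmf(x, p, lam):
    """Zero-inflated Poisson PMF: point mass p at 0, plus (1-p) times Pois(lam)."""
    pois = math.exp(-lam) * lam ** x / math.factorial(x)
    return (p if x == 0 else 0.0) + (1 - p) * pois

# The PMF should sum to 1 (truncate the infinite sum; the tail is negligible).
total = sum(zip_pmf(x, p, lam) for x in range(100))
print(f"sum of PMF: {total:.10f}")

def sample_poisson(lam):
    """Poisson sampler by CDF inversion (adequate for small lam)."""
    u, k, cdf, term = random.random(), 0, 0.0, math.exp(-lam)
    while True:
        cdf += term
        if u <= cdf:
            return k
        k += 1
        term *= lam / k

# Monte Carlo check of P(X = 0) via X = (1 - B) * N.
random.seed(0)
n_trials = 200_000
zeros = 0
for _ in range(n_trials):
    B = 1 if random.random() < p else 0   # B ~ Bernoulli(p)
    X = 0 if B else sample_poisson(lam)   # X = (1 - B) * N
    if X == 0:
        zeros += 1

print(f"empirical P(X=0): {zeros / n_trials:.4f}")
print(f"formula   P(X=0): {zip_pmf(0, p, lam):.4f}")
```

The two printed probabilities should agree to within Monte Carlo error.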
For Part B, I would use the laws of total expectation and total variance. In particular: $$\operatorname{E}[X] = \operatorname{E}[\operatorname{E}[(1-B)N \mid B]] = \operatorname{E}[(1-B)\operatorname{E}[N]] = \operatorname{E}[(1-B)\lambda] = \lambda \operatorname{E}[1-B] = \lambda(1-p),$$ and $$\begin{align} \operatorname{Var}[X] &= \operatorname{E}[\operatorname{Var}[X \mid B]] + \operatorname{Var}[\operatorname{E}[X \mid B]] \\ &= \operatorname{E}[\operatorname{Var}[(1-B)N \mid B]] + \operatorname{Var}[\operatorname{E}[(1-B)N \mid B]] \\ &= \operatorname{E}[(1-B)^2 \operatorname{Var}[N]] + \operatorname{Var}[(1-B)\operatorname{E}[N]] \\ &= \operatorname{E}[(1-B)^2 \lambda] + \operatorname{Var}[(1-B)\lambda] \\ &= \lambda \operatorname{E}[1-2B+B^2] + \lambda^2 \operatorname{Var}[1-B] \\ &= \lambda (1 - 2p + p) + \lambda^2 \operatorname{Var}[B] \\ &= \lambda (1-p) + \lambda^2 p(1-p) \\ &= \lambda (1-p) (1 + \lambda p). \end{align}$$
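The closed forms $\operatorname{E}[X] = \lambda(1-p)$ and $\operatorname{Var}[X] = \lambda(1-p)(1+\lambda p)$ can likewise be checked by simulation. Again, `p = 0.3, lam = 2.0` are arbitrary illustrative values:

```python
import math
import random

# Illustrative parameters (not from the original post).
p, lam = 0.3, 2.0
random.seed(1)

def sample_poisson(lam):
    """Poisson sampler by CDF inversion (fine for small lam)."""
    u, k, cdf, term = random.random(), 0, 0.0, math.exp(-lam)
    while True:
        cdf += term
        if u <= cdf:
            return k
        k += 1
        term *= lam / k

# Draw X = (1 - B) * N repeatedly and compare sample moments to the formulas.
n = 300_000
samples = [0 if random.random() < p else sample_poisson(lam) for _ in range(n)]

mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n

print(f"mean: {mean:.4f}  vs  lam*(1-p)           = {lam * (1 - p):.4f}")
print(f"var : {var:.4f}  vs  lam*(1-p)*(1+lam*p) = {lam * (1 - p) * (1 + lam * p):.4f}")
```

Both sample moments should land within Monte Carlo error of the derived expressions.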