A coin is tossed repeatedly, with probability $p$ of landing tails on each toss. If the result of a toss is heads, the coin is tossed again; otherwise the tossing stops. For each completed toss, the player gets one scratch card. Let $X$ be the number of scratch cards received. Each card independently wins 1 dollar with probability $q$. Let $Y$ be the total amount of money won. Find $E(X)$, $Var(X)$, $E(Y)$, $Var(Y)$.
My attempt:
$X$ follows the geometric distribution with success probability $p$ (where a success is tossing tails). Therefore $E(X)=1/p$ and $Var(X)=(1-p)/p^2$.
For one scratch card, we have $E(Y_1)=q$ and $Var(Y_1)=q(1-q)$. The winnings from the cards can be seen as i.i.d. random variables, which means that $E(\sum_{i=1}^n Y_i)=nq$ and $Var(\sum_{i=1}^n Y_i)=nq(1-q)$. But the number of cards is itself a random variable, so $E(Y)$ and $Var(Y)$ depend on it: if $X=n$, then $E(Y)=nq$ and $Var(Y)=nq(1-q)$.
I'm not sure about my calculations for $Y$, because they depend on the value of $X$. Should I instead write $E(Y\mid X=n)=nq$? But then, how do I find $E(Y)$?
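The fixed-$n$ claim $E(Y\mid X=n)=nq$ and $\operatorname{Var}(Y\mid X=n)=nq(1-q)$ is easy to sanity-check by simulation; here is a minimal sketch (the values of $n$ and $q$ are chosen arbitrarily):

```python
import random

# Check E[Y | X = n] = n*q and Var(Y | X = n) = n*q*(1 - q)
# for a fixed number of cards n.
rng = random.Random(1)
n, q, trials = 7, 0.3, 100_000
samples = [sum(rng.random() < q for _ in range(n)) for _ in range(trials)]
m = sum(samples) / trials
v = sum(s * s for s in samples) / trials - m * m
print(m, n * q)            # both close to 2.1
print(v, n * q * (1 - q))  # both close to 1.47
```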
Thanks.
Let $X_n\stackrel{\mathrm{i.i.d.}}\sim\mathrm{Ber}(1-p)$, so that $X_n=1$ means the $n$-th toss lands heads. Counting one scratch card per heads, $X = \inf\{n>0: X_n = 0\}-1$ is the number of heads before the first tails, and it has a geometric distribution: $\mathbb P(X=k) = (1-p)^k p$ for $k=0,1,2,\ldots$.

We compute the mean of $X$:
\begin{align}
\mathbb E[X] &= \sum_{k=0}^\infty k\cdot\mathbb P(X=k)\\
&= \sum_{k=0}^\infty k(1-p)^kp\\
&= p(1-p)\sum_{k=0}^\infty (k+1)(1-p)^k\\
&= \frac {p(1-p)}{p^2}\\
&= \frac{1-p}p
\end{align}
and the second factorial moment:
\begin{align}
\mathbb E[X(X-1)] &= \sum_{k=1}^\infty k(k-1)\mathbb P(X=k)\\
&= \sum_{k=0}^\infty k(k-1)(1-p)^kp\\
&= p(1-p)^2 \sum_{k=0}^\infty (k+1)(k+2)(1-p)^k\\
&= \frac{2p(1-p)^2}{p^3}\\
&= \frac{2 (1-p)^2}{p^2}.
\end{align}
The variance of $X$ is then
\begin{align}
\operatorname{Var}(X) &= \mathbb E[X(X-1)] + \mathbb E[X] - \mathbb E[X]^2\\
&=\frac{2 (1-p)^2}{p^2}+\frac{1-p}p-\left(\frac{1-p}{p}\right)^2\\
&=\frac{1-p}{p^2}.
\end{align}

Conditioned on $\{X=n\}$, $Y$ is binomially distributed with parameters $n$ and $q$. So for each nonnegative integer $m$, we have
\begin{align}
\mathbb P(Y=m) &= \sum_{n=m}^\infty \mathbb P(Y=m\mid X=n)\mathbb P(X=n)\\
&= \sum_{n=m}^\infty \binom nm q^m(1-q)^{n-m}(1-p)^np\\
&= \frac{p (1-p)^m q^m}{ (p+q -pq)^{m+1}},
\end{align}
where the last equality uses the negative binomial series $\sum_{j=0}^\infty \binom{m+j}{m}x^j = (1-x)^{-(m+1)}$ with $x=(1-p)(1-q)$, noting that $1-(1-p)(1-q)=p+q-pq$.

We compute the mean of $Y$ by conditioning on $X$:
\begin{align}
\mathbb E[Y] &= \mathbb E[\mathbb E[Y\mid X]]\\
&= \mathbb E[qX]\\
&= q\mathbb E[X]\\
&= \frac{q(1-p)}p.
\end{align}
The variance of $Y$ is given by the law of total variance:
\begin{align}
\operatorname{Var}(Y) &= \mathbb E[\operatorname{Var}(Y\mid X)] + \operatorname{Var}(\mathbb E[Y\mid X])\\
&=\mathbb E[q(1-q)X] + \operatorname{Var}(qX)\\
&=q(1-q)\mathbb E[X] + q^2\operatorname{Var}(X)\\
&=\frac{q(1-q)(1-p)}p + \frac{q^2(1-p)}{p^2}\\
&= \frac{(1-p) \left(pq+q^2-pq^2\right)}{p^2}.
\end{align}
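As a sanity check on these closed forms, a quick Monte Carlo simulation (parameter values chosen arbitrarily) reproduces all four quantities:

```python
import random

def simulate(p, q, trials=200_000, seed=0):
    """Monte Carlo estimates of E[X], Var(X), E[Y], Var(Y), where X counts
    heads before the first tails and Y | X = n is Binomial(n, q)."""
    rng = random.Random(seed)
    xs, ys = [], []
    for _ in range(trials):
        x = 0
        while rng.random() >= p:  # heads with probability 1 - p
            x += 1
        y = sum(rng.random() < q for _ in range(x))  # one card per heads
        xs.append(x)
        ys.append(y)
    mean = lambda v: sum(v) / len(v)
    var = lambda v: mean([t * t for t in v]) - mean(v) ** 2
    return mean(xs), var(xs), mean(ys), var(ys)

p, q = 0.4, 0.3
ex, vx, ey, vy = simulate(p, q)
# Closed forms derived above, evaluated at p = 0.4, q = 0.3:
EX = (1 - p) / p                              # = 1.5
VX = (1 - p) / p**2                           # = 3.75
EY = q * (1 - p) / p                          # = 0.45
VY = (1 - p) * (p*q + q**2 - p*q**2) / p**2   # = 0.6525
print(ex, EX, vx, VX, ey, EY, vy, VY)
```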