Let $U$ and $V$ be independent random variables, each with a Bernoulli distribution: $U \sim \text{Bernoulli}(0.2)$ and $V \sim \text{Bernoulli}(0.2)$.
Let $W = U \cdot V$
(a) Compute $E(W)$
(b) Compute $E(UW)$
My attempt:
$(a)$
$$E(W) = E(UV) = E(U)E(V)$$
because $U$ and $V$ are independent.
$E(U) = E(V) = 0.2$, so $E(W) = 0.04$
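As a quick sanity check (not part of the original derivation), here is a short Monte Carlo simulation; the sample size and seed are arbitrary choices:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible
n = 100_000

# Simulate W = U * V with U, V independent Bernoulli(0.2).
# (random.random() < 0.2) is True with probability 0.2; True counts as 1.
samples = [(random.random() < 0.2) * (random.random() < 0.2) for _ in range(n)]
e_w = sum(samples) / n
print(e_w)  # should be close to E(U)E(V) = 0.04
```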
$(b)$
$E(UW) = E(U \cdot UV) = E(U^2 V) = E(U^2)E(V)$
$E(U^2) = P(U = 1) = 0.2$, not $0.2^2$: since $U$ takes only the values $0$ and $1$, we have $U^2 = U$.
Thus,
$E(UW) = E(U^2)E(V) = 0.2 \cdot 0.2 = 0.04$
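Since $U$ and $V$ each take only two values, $E(UW)$ can also be checked by exact enumeration of the four outcomes (a sketch, independent of the derivation above):

```python
p = 0.2  # success probability of both Bernoulli variables

# E(UW) = sum over outcomes of u * (u*v) * P(U=u) * P(V=v);
# only the outcome u = v = 1 contributes, giving 0.2 * 0.2.
e_uw = sum(
    u * (u * v) * (p if u == 1 else 1 - p) * (p if v == 1 else 1 - p)
    for u in (0, 1)
    for v in (0, 1)
)
print(e_uw)  # 0.04 up to floating-point rounding
```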
Note that
\begin{align} E(U^2) & = 0 \cdot P(U^2 = 0) + 1 \cdot P(U^2 = 1) \\ & = 0 \cdot P(U = 0) + 1 \cdot P(U = 1) \qquad \leftarrow \text{because $U$ is either $0$ or $1$} \\ & = P(U = 1) = 0.2 \end{align}