$X_{1} \sim B(n_{1},p_{1})$ and $X_{2} \sim B(n_{2},p_{2})$ are independent. Let $Y = X_{1} + X_{2}$. I have worked out the moment generating function to be $$M_{Y}(t) = (p_{1}e^{t} + 1 - p_{1})^{n_{1}}(p_{2}e^{t}+1-p_{2})^{n_{2}}$$
Now to find the expectation of $Y$:
$$E[Y] = \frac{\mathrm{d}M_{Y}(t)}{\mathrm{d}t}\left.\right|_{t = 0} = \frac{\mathrm{d}}{\mathrm{d}t}\left[(p_{1}e^{t} + 1 - p_{1})^{n_{1}}(p_{2}e^{t} + 1 - p_{2})^{n_{2}}\right]_{t=0}$$
$$= n_{1}p_{1}e^{t}(p_{1}e^{t}+1-p_{1})^{n_{1}-1}(p_{2}e^{t} + 1 - p_{2})^{n_{2}} + n_{2}p_{2}e^{t}(p_{1}e^{t}+1-p_{1})^{n_{1}}(p_{2}e^{t}+1-p_{2})^{n_{2}-1}\left.\right|_{t=0}$$
$$= e^{t}(p_{1}e^{t}+1-p_{1})^{n_{1}-1}(p_{2}e^{t}+1-p_{2})^{n_{2}-1}(n_{1}p_{1}(p_{2}e^{t}+1-p_{2})+n_{2}p_{2}(p_{1}e^{t}+1-p_{1}))\left.\right|_{t=0}$$
$$= e^{t}(p_{1}e^{t}+1-p_{1})^{n_{1}-1}(p_{2}e^{t}+1-p_{2})^{n_{2}-1}(n_{1}p_{1}p_{2}e^{t} + n_{1}p_{1} - n_{1}p_{1}p_{2} + n_{2}p_{1}p_{2}e^{t} + n_{2}p_{2} - n_{2}p_{1}p_{2})\left.\right|_{t=0}$$
$$=n_{1}p_{1} + n_{2}p_{2}$$
Now for $E[Y^{2}]$ we must differentiate $\frac{\mathrm{d}M_{Y}(t)}{\mathrm{d}t}$ once more, which is clearly tedious. Is there a way to skip this?
Set $X_{1}=U_{1}+\cdots+U_{n_{1}}$ and $X_{2}=V_{1}+\cdots+V_{n_{2}}$, where the $U_{i}$ and $V_{j}$ are independent and Bernoulli-distributed, with parameter $p_{1}$ for the $U_{i}$ and $p_{2}$ for the $V_{j}$.
Then $Y=U_{1}+\cdots+U_{n_{1}}+V_{1}+\cdots+V_{n_{2}}$. Now expand $Y^{2}$ and find $\mathbb{E}Y^{2}$ as a sum of expectations that are not difficult to compute.
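To sketch how the expansion goes, write $W_{1},\dots,W_{n_{1}+n_{2}}$ for the combined list of indicators ($W_{i}$ being my label for the $U$'s followed by the $V$'s). Since each $W_{i}$ is $0$ or $1$ we have $W_{i}^{2}=W_{i}$, and independence gives $\mathbb{E}W_{i}W_{j}=\mathbb{E}W_{i}\,\mathbb{E}W_{j}$ for $i\neq j$, so
$$\mathbb{E}Y^{2}=\sum_{i}\mathbb{E}W_{i}^{2}+\sum_{i\neq j}\mathbb{E}W_{i}W_{j}=n_{1}p_{1}+n_{2}p_{2}+n_{1}(n_{1}-1)p_{1}^{2}+n_{2}(n_{2}-1)p_{2}^{2}+2n_{1}n_{2}p_{1}p_{2},$$
where the three quadratic terms count the ordered pairs within the $U$'s, within the $V$'s, and across the two groups.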
Likewise you could find $\mathbb{E}Y$ in an easy way: by linearity of expectation, $\mathbb{E}Y=n_{1}p_{1}+n_{2}p_{2}$ with no differentiation at all.
Another route is to use $\mathbb{E}Y^{2}-\left(\mathbb{E}Y\right)^{2}=\operatorname{Var}Y=\operatorname{Var}X_{1}+\operatorname{Var}X_{2}$, where the variances add because $X_{1}$ and $X_{2}$ are independent.
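Carrying this route out with the standard binomial moments $\mathbb{E}X_{k}=n_{k}p_{k}$ and $\operatorname{Var}X_{k}=n_{k}p_{k}(1-p_{k})$:
$$\mathbb{E}Y^{2}=\operatorname{Var}X_{1}+\operatorname{Var}X_{2}+\left(\mathbb{E}Y\right)^{2}=n_{1}p_{1}(1-p_{1})+n_{2}p_{2}(1-p_{2})+\left(n_{1}p_{1}+n_{2}p_{2}\right)^{2},$$
which agrees with what the indicator expansion gives, and requires no differentiation of the MGF.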