If $X \sim \mathscr N(\mu_1,\sigma_1^2)$ and $Y \sim \mathscr N(\mu_2,\sigma_2^2)$ are independent, how can one show that $$X+Y\sim \mathscr N(\mu_1+\mu_2,\;\sigma_1^2+\sigma_2^2)\,?$$
Hint: $$\frac{1}{\sqrt{2\pi}} \int_{-\infty}^\infty e^{-\frac{x^2}{2}}dx=1 $$
The simplest way is perhaps to use the moment generating function (MGF) or the characteristic function of an $\mathscr N(\mu,\sigma^2)$ random variable.
$$\mathbb{E}[e^{s(X+Y)}] = \mathbb{E}[e^{sX}]\mathbb{E}[e^{sY}]$$ by independence of $X$ and $Y$.
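For completeness, the MGF of a normal random variable can be derived from the hint by completing the square: for $Z \sim \mathscr N(\mu,\sigma^2)$,

$$
\begin{aligned}
\mathbb{E}[e^{sZ}]
&= \frac{1}{\sqrt{2\pi}\,\sigma}\int_{-\infty}^{\infty} e^{sz}\, e^{-\frac{(z-\mu)^2}{2\sigma^2}}\,dz \\
&= e^{s\mu + \tfrac{s^2}{2}\sigma^2} \cdot \frac{1}{\sqrt{2\pi}\,\sigma}\int_{-\infty}^{\infty} e^{-\frac{(z-\mu-s\sigma^2)^2}{2\sigma^2}}\,dz \\
&= e^{s\mu + \tfrac{s^2}{2}\sigma^2},
\end{aligned}
$$

where the last integral equals $1$: substituting $u = (z-\mu-s\sigma^2)/\sigma$ reduces it to exactly the integral in the hint.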
Since the MGF of $\mathscr N(\mu,\sigma^2)$ is $e^{s\mu + \tfrac{s^2}{2}\sigma^2}$, we get
$$\mathbb{E}[e^{sX}]\,\mathbb{E}[e^{sY}] = e^{s\mu_1 + \tfrac{s^2}{2}\sigma_1^2}\, e^{s\mu_2 + \tfrac{s^2}{2}\sigma_2^2} = e^{s(\mu_1+\mu_2) + \tfrac{s^2}{2}(\sigma_1^2+\sigma_2^2)},$$
which is exactly the MGF of $\mathscr N(\mu_1+\mu_2,\sigma_1^2+\sigma_2^2)$. The result follows by uniqueness of the MGF (which applies whenever the MGF exists in a neighborhood of $0$, as it does for normal random variables).
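Not a proof, but a quick Monte Carlo sanity check of the statement (a sketch; the particular parameter values below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
mu1, s1 = 1.0, 2.0    # X ~ N(1, 4)
mu2, s2 = -0.5, 1.5   # Y ~ N(-0.5, 2.25)
n = 1_000_000

x = rng.normal(mu1, s1, n)
y = rng.normal(mu2, s2, n)
z = x + y

# Sample mean and variance of X + Y should be close to
# mu1 + mu2 = 0.5 and s1**2 + s2**2 = 6.25.
print(z.mean())
print(z.var())
```

With a million samples, both estimates land well within Monte Carlo error of the predicted values.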