Let $x$ be a scalar random variable. A theorem states that if $E[\exp(ixs)]= \exp\Big( i{s}\mu - \tfrac{1}{2} {\sigma^2s^2} \Big)$ holds for all $s$ in some neighborhood of the origin (i.e. for all $|s|<\delta$, for some $\delta>0$), then ${x}$ is normally distributed with mean $\mu$ and variance $\sigma^2$ (e.g. Lukacs, 1970, Chapter 7).
The question is whether this extends to the multivariate case. Specifically, if $E[\exp(i\mathbf{s}'\mathbf{x})]= \exp\Big( i\mathbf{s}'\boldsymbol\mu - \tfrac{1}{2} \mathbf{s}'\boldsymbol\Sigma \mathbf{s} \Big)$ holds for all $\mathbf{s}$ in some neighborhood of the origin, does $\mathbf{x}$ have a multivariate normal distribution?
Yes. Being multivariate normal means exactly that every linear combination of the entries of $\mathbf{x}$ is (univariate) normal, and the one-dimensional theorem delivers exactly that. Fix any $\mathbf{u}\neq\mathbf{0}$ and set $\mathbf{s}=t\mathbf{u}$. For all $|t|<\delta/\|\mathbf{u}\|$ we have $\|\mathbf{s}\|<\delta$, so the hypothesis gives
$$E[\exp(it\,\mathbf{u}'\mathbf{x})]= \exp\Big( it\,\mathbf{u}'\boldsymbol\mu - \tfrac{1}{2} t^2\,\mathbf{u}'\boldsymbol\Sigma \mathbf{u} \Big),$$
which is the characteristic function of $N(\mathbf{u}'\boldsymbol\mu,\ \mathbf{u}'\boldsymbol\Sigma\mathbf{u})$ on a neighborhood of the origin. By the one-dimensional theorem, $\mathbf{u}'\mathbf{x}$ is therefore normal; since $\mathbf{u}$ was arbitrary, $\mathbf{x}$ is multivariate normal with mean $\boldsymbol\mu$ and covariance $\boldsymbol\Sigma$.
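As a numerical illustration (not a proof), here is a minimal sketch in Python with NumPy: for a bivariate normal with an assumed mean and covariance, the Monte Carlo estimate of $E[\exp(i\mathbf{s}'\mathbf{x})]$ matches $\exp\big(i\mathbf{s}'\boldsymbol\mu - \tfrac{1}{2}\mathbf{s}'\boldsymbol\Sigma\mathbf{s}\big)$ at points $\mathbf{s}$ near the origin. The choice of $\boldsymbol\mu$, $\boldsymbol\Sigma$, sample size, and test points is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative parameters (not from the question).
mu = np.array([1.0, -0.5])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

# Monte Carlo samples from N(mu, Sigma).
x = rng.multivariate_normal(mu, Sigma, size=200_000)

def empirical_cf(s):
    """Monte Carlo estimate of E[exp(i s'x)]."""
    return np.mean(np.exp(1j * (x @ s)))

def gaussian_cf(s):
    """Theoretical multivariate normal characteristic function."""
    return np.exp(1j * (s @ mu) - 0.5 * (s @ Sigma @ s))

# Compare at a few points s near the origin; Monte Carlo error
# is O(1/sqrt(n)), so agreement to ~1e-2 is expected.
for s in [np.array([0.1, 0.0]),
          np.array([0.0, 0.2]),
          np.array([0.15, -0.1])]:
    assert abs(empirical_cf(s) - gaussian_cf(s)) < 1e-2
```

Restricting the grid of test points to a small ball around $\mathbf{s}=\mathbf{0}$ mirrors the theorem's hypothesis: agreement is only assumed there, and the result says that is already enough.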