A one-dimensional random variable $\Gamma$ is Gaussian if it has the characteristic function $$\mathbb{E}\hspace{0.15cm}e^{i\xi\Gamma}=e^{im\xi-\frac{1}{2}\sigma^2\xi^2}\tag{1}$$ for some real numbers $m\in\mathbb{R}$ and $\sigma\geq0$. If we differentiate $(1)$ twice with respect to $\xi$ and set $\xi=0$, we see that $$m=\mathbb{E}\hspace{0.15cm}\Gamma,\qquad\sigma^2=\mathbb{V}\hspace{0.15cm}\Gamma.\tag{2}$$
I cannot see how to obtain $(2)$ by differentiating $(1)$ twice with respect to $\xi$ and setting $\xi=0$.
If I differentiate the r.h.s. of $(1)$ twice with respect to $\xi$ and set $\xi=0$, I get $-m^2$.
This question, the other answer, and the comment discussion are muddled. For a random variable $X$ with finite second moment and characteristic function $\varphi_X(t)=E[e^{itX}]$, the following hold: $$\varphi_X(t)=1+iE[X]\,t-\frac{E[X^2]}{2}t^2+o(t^2),\\ E[X]=\mu=-i\varphi_X'(0),\\ E[X^2]=\mu^2+\sigma^2=-\varphi_X''(0).$$ If you define $\psi=\log\varphi_X$, then $$\psi(t)=i\mu t-\sigma^2\frac{t^2}{2}+o(t^2),\\ \mu=-i\psi'(0),\\ \sigma^2=-\psi''(0),$$ which can be connected with the previous equations by applying the chain rule and product rule: $$\varphi_X(t)=e^{\psi(t)},\\ \varphi_X'(t)=e^{\psi(t)}\psi'(t),\\ \varphi_X''(t)=e^{\psi(t)}\bigl(\psi'(t)\bigr)^2+e^{\psi(t)}\psi''(t),$$ so that $$\varphi_X'(0)=e^{\psi(0)}\psi'(0)=\psi'(0)=i\mu,\\ \varphi_X''(0)=e^{\psi(0)}\bigl(\psi'(0)\bigr)^2+e^{\psi(0)}\psi''(0)=-\mu^2-\sigma^2,$$ and so on.
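As a sanity check, these identities can be verified symbolically with SymPy (a sketch under the notation of $(1)$; the symbol names are my own):

```python
import sympy as sp

xi, m, sigma = sp.symbols('xi m sigma', real=True)

# Gaussian characteristic function: phi(xi) = exp(i*m*xi - sigma^2*xi^2/2)
phi = sp.exp(sp.I*m*xi - sp.Rational(1, 2)*sigma**2*xi**2)

# First derivative at 0: phi'(0) = i*m, so E[X] = -i*phi'(0) = m
d1 = sp.diff(phi, xi).subs(xi, 0)
print(sp.simplify(-sp.I*d1))      # m

# Second derivative at 0: phi''(0) = -(m^2 + sigma^2), so E[X^2] = -phi''(0)
d2 = sp.diff(phi, xi, 2).subs(xi, 0)
print(sp.simplify(-d2))           # m**2 + sigma**2

# Variance: E[X^2] - E[X]^2 = sigma^2
print(sp.simplify(-d2 - m**2))    # sigma**2
```

This matches the claim in $(2)$: the first derivative at $0$ recovers the mean, and the second derivative recovers the second moment, from which the variance follows.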