The normalized Gaussian distribution is defined as:
$$N(x\mid\mu,\sigma^2)=\frac{1}{(2\pi\sigma^2)^{1/2}}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$
Prove that:
$$Var[x]=E[(X-\mu)^2]=\sigma^2$$
by differentiating both sides of the normalization condition:
$$\int_{-\infty}^{\infty} N(x|\mu, \sigma^2)~dx = 1$$
with respect to $\sigma^2$, rearranging the result so that:
$$E[(X-\mu)^2] = \sigma^2$$
I've tried this proof at least 10 times... and the variance always comes out as a fraction of a square...
For simplicity let's suppress the $\mu$ and $\sigma^2$ in $N(x\mid \mu,\sigma^2)$. The fastest way to differentiate $N$ is logarithmic differentiation: $$ \log N = -\frac{(x-\mu)^2}{2\sigma^2} -\log(\sqrt{2\pi}\cdot\sigma).\tag1 $$ Now differentiate both sides with respect to $\sigma$: $$ \frac1N\frac{\partial N}{\partial\sigma}=\frac{(x-\mu)^2}{\sigma^3}-\frac1\sigma, $$ so that $$\frac{\partial N}{\partial\sigma}=N(x)\left[\frac{(x-\mu)^2-\sigma^2}{\sigma^3}\right].\tag2$$ You should be able to take it from here, given that $\int N(x)(x-\mu)^2\,dx=E[(X-\mu)^2]$ and $\int N(x)\,dx=1$.
(You can also differentiate (1) wrt $\sigma^2$ by defining $t:=\sigma^2$ and replacing all occurrences of $\sigma$ by $t^{1/2}$, then computing $\frac\partial{\partial t}$. The same result should obtain.)
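Spelling out that alternative for completeness: with $t:=\sigma^2$, equation (1) becomes $\log N = -\frac{(x-\mu)^2}{2t}-\frac12\log(2\pi t)$, and differentiating with respect to $t$ gives
$$
\frac1N\frac{\partial N}{\partial t}=\frac{(x-\mu)^2}{2t^2}-\frac1{2t},
\qquad\text{so}\qquad
\frac{\partial N}{\partial t}=N(x)\left[\frac{(x-\mu)^2-\sigma^2}{2\sigma^4}\right],
$$
which carries the same bracketed factor $(x-\mu)^2-\sigma^2$ as (2), only scaled by $\frac1{2\sigma^4}$ instead of $\frac1{\sigma^3}$, so the integration step below goes through unchanged.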
EDIT: Once you've established (2), here are the details:
$$0\stackrel{(a)}=\frac d{d\sigma}\int N(x)\,dx \stackrel{(b)}=\int \frac{\partial N}{\partial \sigma}\,dx\stackrel{(c)}=\frac1{\sigma^3}\left[\int N(x)(x-\mu)^2\,dx-\sigma^2\int N(x)\,dx\right] $$ Step (a) follows because $\int N(x)\,dx=1$. Step (b) is the Leibniz integral rule. In step (c) we substitute (2). Multiplying through by $\sigma^3$ and using $\int N(x)\,dx=1$ once more gives $E[(X-\mu)^2]=\sigma^2$, as required.
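As a quick numerical sanity check (not part of the proof), one can verify both the normalization condition and $E[(X-\mu)^2]=\sigma^2$ by integrating the density on a grid; the particular values of $\mu$ and $\sigma$ below are arbitrary:

```python
import numpy as np

# Arbitrary parameters for the check (any values work)
mu, sigma = 1.5, 2.0

# Dense uniform grid covering +/- 10 standard deviations around the mean;
# the density beyond that range is negligible
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200_001)
dx = x[1] - x[0]

# The normal density N(x | mu, sigma^2)
N = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

# Riemann-sum approximations of the two integrals in the proof
total = (N * dx).sum()                     # should be ~1 (normalization)
variance = (N * (x - mu) ** 2 * dx).sum()  # should be ~sigma^2 = 4.0

print(total, variance)
```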