I am calculating the variance of a standard normal distribution. I know there are other ways to do this, but I want to use the gamma function. I calculated up to $\frac{2}{\sqrt \pi}\Gamma (\frac{3}{2})$. But I don't understand why the following calculation results in 1. Please give me a hint for showing
$$\frac{2}{\sqrt \pi}\Gamma (\frac{3}{2})=1$$
You can brute-force compute it from the definition $\Gamma(x) = \int_0^\infty t^{x - 1} e^{-t} \ dt$. Integrate by parts, then substitute $t = s^2$ (so $dt = 2s \ ds$): $$ \Gamma\Big(\frac{3}{2}\Big) = \int_0^\infty t^{\frac{1}{2}} e^{-t} \ dt = -\underbrace{ \Big[t^{\frac{1}{2}}e^{-t}\Big]_0^\infty }_{= 0} + \int_0^\infty \frac{1}{2}t^{-\frac{1}{2}} e^{-t} \ dt = \int_0^\infty e^{-s^2} \ ds = \frac{\sqrt{\pi}}{2}, $$ where the last equality is the standard Gaussian integral. Hence $\frac{2}{\sqrt \pi}\Gamma\big(\frac{3}{2}\big) = \frac{2}{\sqrt \pi}\cdot\frac{\sqrt{\pi}}{2} = 1$.
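As a quick numerical sanity check, the identity is easy to verify with Python's standard-library `math.gamma` (a sketch, not part of the derivation above):

```python
import math

# Gamma(3/2) should equal sqrt(pi)/2, so the prefactor 2/sqrt(pi)
# cancels it and the product should be 1 up to floating-point rounding.
gamma_three_halves = math.gamma(1.5)
value = (2 / math.sqrt(math.pi)) * gamma_three_halves

print(gamma_three_halves)  # ~0.8862269, i.e. sqrt(pi)/2
print(value)               # ~1.0
```

This confirms numerically that $\Gamma(3/2) = \sqrt{\pi}/2$, so the normalization works out to exactly 1.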