Variance of a Gaussian Random Variable


Show that the variance of a Gaussian random variable $X \sim N(\mu,\sigma^2)$ is $\sigma^2$. I know that $(\mathbb{E}(X))^2 = \mu^2$.

So I need $\mathbb{E}(X^2) = \int_{\mathbb{R}} x^2 \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)dx$. The result should be $\sigma^2 + \mu^2$, but I cannot see the integration trick used to compute this.
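As a quick numerical sanity check of the claim $\mathbb{E}(X^2) = \sigma^2 + \mu^2$ (a minimal sketch; the midpoint-rule integrator, the truncation window, and the sample values of $\mu$, $\sigma$ here are my own choices, not part of the question):

```python
import math

def gaussian_pdf(x, mu, sigma):
    # Density of N(mu, sigma^2).
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

def second_moment(mu, sigma, n=200_000, width=12.0):
    # Midpoint-rule approximation of \int x^2 pdf(x) dx over
    # [mu - width*sigma, mu + width*sigma]; the tails beyond are negligible.
    a = mu - width * sigma
    b = mu + width * sigma
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h
        total += x * x * gaussian_pdf(x, mu, sigma)
    return total * h

mu, sigma = 1.5, 2.0
print(second_moment(mu, sigma))  # close to sigma^2 + mu^2 = 6.25
```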

Best answer:

Hint: If you want to calculate it using an integral expression, it is actually easier to calculate the variance directly, i.e.

$$\text{var}(X) = \mathbb{E}((X-\mu)^2) = \int (x-\mu)^2 \frac{1}{\sqrt{2\pi \sigma^2}} \exp \left(- \frac{(x-\mu)^2}{2\sigma^2} \right) \, dx.$$

To this end, write

$$\sqrt{\frac{\sigma^2}{2\pi}} \int(x-\mu) \cdot \left[ \frac{(x-\mu)}{\sigma^2} \exp \left(- \frac{(x-\mu)^2}{2\sigma^2} \right) \right] \, dx$$

and use integration by parts.
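One way the integration by parts can be finished (a sketch filling in the step the hint leaves to the reader, with $u = x-\mu$ and $dv = \frac{x-\mu}{\sigma^2} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) dx$, so that $v = -\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$):

$$\text{var}(X) = \sqrt{\frac{\sigma^2}{2\pi}} \left( \left[ -(x-\mu) \exp \left(- \frac{(x-\mu)^2}{2\sigma^2} \right) \right]_{-\infty}^{\infty} + \int \exp \left(- \frac{(x-\mu)^2}{2\sigma^2} \right) dx \right).$$

The boundary term vanishes, and the remaining integral is the Gaussian integral $\int_{\mathbb{R}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) dx = \sqrt{2\pi\sigma^2}$, so

$$\text{var}(X) = \sqrt{\frac{\sigma^2}{2\pi}} \cdot \sqrt{2\pi\sigma^2} = \sigma^2.$$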