Showing the normal distribution has points of inflection at $x = \mu \pm \sigma$ and a maximum at $x = \mu$

$X \sim N(\mu, \sigma^2)$

That is, $f_X$ is the normal density with mean $\mu$ and variance $\sigma^2$.

I am looking to show that $f_X(x)$ has points of inflections at $x = \mu \pm \sigma$.

In my notes it says that we should work with $\ln(f_X(x))$ instead of $f_X(x)$ directly, since $\ln$ is an increasing function and so the two give equivalent answers.

When I take the first derivative of $\ln(f_X(x))$ with respect to $x$ I get $\frac{-x + \mu}{\sigma^2}$. This implies a critical point at $x = \mu$.

Then the second derivative is $\frac{-1}{\sigma^2} < 0$, so $x = \mu$ is a maximum.

But how do I show the points of inflection are $x = \mu \pm \sigma$?


You have to work with $f(x)$ for the points of inflection, not $\ln(f(x))$. When $f(x)$ is the pdf of a normal distribution, $\ln(f(x))$ is just a quadratic function of $x$, so it has no inflection points.

But dealing with $f(x)$ isn't hard. If $$f(x)=\frac{1}{\sigma\sqrt{2\pi}}e^{-(x-\mu)^2/(2\sigma^2)}$$ then $$f'(x)=\frac{1}{\sigma\sqrt{2\pi}}e^{-(x-\mu)^2/(2\sigma^2)}\cdot\frac{-x+\mu}{\sigma^2}$$ Therefore, using the product rule, $$f''(x)=\frac{1}{\sigma\sqrt{2\pi}}e^{-(x-\mu)^2/(2\sigma^2)}\cdot\left(-\frac{1}{\sigma^2}\right)+\left(\frac{-x+\mu}{\sigma^2}\right)\frac{1}{\sigma\sqrt{2\pi}}e^{-(x-\mu)^2/(2\sigma^2)}\cdot\frac{-x+\mu}{\sigma^2}$$ Factoring gives $$f''(x)=\underbrace{\frac{1}{\sigma\sqrt{2\pi}}e^{-(x-\mu)^2/(2\sigma^2)}}\left(-\frac{1}{\sigma^2}+\left(\frac{-x+\mu}{\sigma^2}\right)^2\right)$$ Can you take it from here? (Hint: the underbraced factor is never zero, so $f''(x)=0$ exactly when the second factor vanishes.)
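If you want a numerical sanity check of the algebra, here is a quick Python sketch (the values of $\mu$ and $\sigma$ are arbitrary, chosen just for illustration). It approximates $f''$ with a central difference and confirms that $f''$ changes sign at $x = \mu \pm \sigma$, which is exactly what an inflection point requires:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # density of N(mu, sigma^2)
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def second_derivative(f, x, h=1e-4):
    # central-difference approximation of f''(x)
    return (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2

mu, sigma = 2.0, 1.5  # arbitrary example values
f = lambda x: normal_pdf(x, mu, sigma)

# f'' should change sign across each of x = mu - sigma and x = mu + sigma
for x0 in (mu - sigma, mu + sigma):
    left = second_derivative(f, x0 - 0.1)
    right = second_derivative(f, x0 + 0.1)
    print(f"x = {x0}: sign change in f'' -> {left * right < 0}")
```

Remember that $f''(x_0) = 0$ alone is not enough: you also need $f''$ to change sign at $x_0$, which the factored form above makes easy to see.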

For some context: taking the logarithm is a useful trick for first-order optimization, and you'll use it a lot when you get to maximum likelihood estimation. In that setting, you'll log a complicated joint pdf (usually a product of a bunch of univariate pdfs, assuming independence) in order to get something much easier to differentiate (and therefore maximize -- hence the maximum in "maximum likelihood"), namely the sum of a bunch of logs.
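To make that concrete, here is a small, hedged sketch of the MLE idea with synthetic data (the sample size, seed, and grid search are just illustrative choices, not part of the original answer). With $\sigma$ held fixed, maximizing the log-likelihood of i.i.d. normal samples over $\mu$ recovers the sample mean:

```python
import math
import random

random.seed(0)
mu_true, sigma = 3.0, 1.0
data = [random.gauss(mu_true, sigma) for _ in range(1000)]

def log_likelihood(mu):
    # log of the joint pdf: the product of densities becomes a sum of logs
    return sum(-(x - mu) ** 2 / (2 * sigma ** 2)
               - math.log(sigma * math.sqrt(2 * math.pi)) for x in data)

# Crude grid search; analytically the maximizer is the sample mean,
# because the log-likelihood is a downward parabola in mu.
grid = [mu_true - 1 + i * 0.001 for i in range(2001)]
mu_hat = max(grid, key=log_likelihood)
sample_mean = sum(data) / len(data)
print(round(mu_hat, 3), round(sample_mean, 3))
```

The point of the exercise: differentiating the sum of logs with respect to $\mu$ is trivial, while differentiating the raw product of 1000 exponentials directly would be a mess.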

It's also worth understanding exactly why taking the log yields the same critical points. Suppose $f(x)>0$ for all $x$. Then, with $g(x):=\ln(f(x))$, we have $$g'(x)=\frac{1}{f(x)}\cdot f'(x)$$ Since $f(x)>0$, $g'$ has precisely the same roots as $f'$, and it always has the same sign, too; so a maximum of $f$ is a maximum of $g$, and vice versa.
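You can also check this fact numerically. The sketch below (again with arbitrary illustrative values of $\mu$ and $\sigma$) compares finite-difference derivatives of $f$ and $g = \ln f$ at several points and confirms they agree in sign, as the identity $g' = f'/f$ with $f > 0$ predicts:

```python
import math

mu, sigma = 0.0, 2.0  # arbitrary example values
f = lambda x: math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
g = lambda x: math.log(f(x))

def deriv(h_f, x, h=1e-6):
    # central-difference approximation of the first derivative
    return (h_f(x + h) - h_f(x - h)) / (2 * h)

# g'(x) = f'(x) / f(x): same sign (and same roots) as f', since f(x) > 0
for x in (-3, -1, -0.5, 0.5, 1, 3):
    assert (deriv(f, x) > 0) == (deriv(g, x) > 0)
print("signs agree")
```

So for locating and classifying the maximum, $f$ and $\ln f$ are interchangeable; it is only the inflection points, which depend on the second derivative of $f$ itself, that force you back to $f$.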