Finding unknown variance in order to maximize probability given a Gaussian distribution


Given a Gaussian (normal) random variable $X$ with known mean $\mu$ and unknown variance $\sigma^2$, how would we determine the variance that maximizes the probability $P[X_1<X<X_2]$?

The way I think about this is that, to maximize the probability, I should set the derivative with respect to $\sigma$ of $$\int_{X_1}^{X_2} {\frac {1}{\sigma {\sqrt {2\pi }}}}e^{-{\frac {1}{2}}\left({\frac {x-\mu }{\sigma }}\right)^{2}}\,dx$$ equal to $0$. However, I don't know where to go from there.


Letting $Z \sim N(0,1)$ and noting that $Z$ and $\frac{X-\mu}{\sigma}$ are equal in distribution, then

\begin{align*} P(x_1 \le X \le x_2) &= P \left (\frac{x_1-\mu}{\sigma} \le \frac{X-\mu}{\sigma} \le \frac{x_2-\mu}{\sigma}\right)\\ &= P \left (\frac{x_1-\mu}{\sigma} \le Z \le \frac{x_2-\mu}{\sigma}\right)\\ &= \Phi \left ( \frac{x_2-\mu}{\sigma}\right) - \Phi \left ( \frac{x_1-\mu}{\sigma}\right) \end{align*}
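This difference of standard-normal CDFs is easy to evaluate numerically; here is a small sketch using only the standard library, with $\mu$, $x_1$, $x_2$, $\sigma$ chosen purely for illustration:

```python
import math

def Phi(z):
    # Standard normal CDF expressed via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def prob_between(x1, x2, mu, sigma):
    # P(x1 <= X <= x2) for X ~ N(mu, sigma^2), via the Phi difference above
    return Phi((x2 - mu) / sigma) - Phi((x1 - mu) / sigma)

# Illustrative values (not from the question)
print(prob_between(1.0, 3.0, mu=0.0, sigma=1.5))
```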

where $\Phi$ denotes the cdf of the standard normal, i.e. $\Phi(z) = P(Z \le z)$. Now, differentiating this expression with respect to $\sigma$ and setting it equal to zero yields:

$$ 0 = -\frac{(x_2-\mu)}{\sigma^2} \phi \left ( \frac{x_2-\mu}{\sigma}\right) +\frac{(x_1-\mu)}{\sigma^2} \phi \left ( \frac{x_1-\mu}{\sigma}\right) $$ where $\phi(z) = \frac{1}{\sqrt{2 \pi}} e^{-z^2/2}$ denotes the pdf of the standard normal, and $\frac{d}{dz} \Phi(z) = \phi(z)$. Rearranging the above yields \begin{align*} &(x_2-\mu) \phi \left ( \frac{x_2-\mu}{\sigma}\right) =(x_1-\mu) \phi \left ( \frac{x_1-\mu}{\sigma}\right)\\ \implies & (x_2-\mu) \frac{1}{\sqrt{2 \pi}} \exp \left \{ -\frac{1}{2 \sigma^2} (x_2-\mu)^2 \right \} =(x_1-\mu) \frac{1}{\sqrt{2 \pi}} \exp \left \{ -\frac{1}{2 \sigma^2} (x_1-\mu)^2 \right \}\\ \implies & \frac{x_2-\mu}{x_1 - \mu} = \frac{\exp \left \{ -\frac{1}{2 \sigma^2} (x_1-\mu)^2 \right \}}{ \exp \left \{ -\frac{1}{2 \sigma^2} (x_2-\mu)^2 \right \} }\\ \implies & \frac{x_2-\mu}{x_1 - \mu} = \exp \left \{ -\frac{1}{2 \sigma^2}[ (x_1-\mu)^2 - (x_2-\mu)^2] \right \}\\ \implies & \log \left ( \frac{x_2-\mu}{x_1 - \mu} \right) = -\frac{1}{2 \sigma^2}[ (x_1-\mu)^2 - (x_2-\mu)^2]\\ \implies & \sigma^2 = \frac{(x_2-\mu)^2 - (x_1-\mu)^2}{ 2\log \left ( \frac{x_2-\mu}{x_1 - \mu} \right)} \end{align*}

As noted in the comments, we must have $\frac{x_2-\mu}{x_1-\mu}>0$ (i.e. $x_1$ and $x_2$ lie on the same side of $\mu$) for the final expression to be defined. Indeed, if $x_1 < \mu < x_2$, then $P(x_1 \le X \le x_2) \to 1$ as $\sigma \to 0$, so the derivative never vanishes and no maximizing $\sigma^2 > 0$ exists.
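The closed form can be sanity-checked numerically: evaluate the candidate $\sigma^2$, confirm it satisfies the stationarity condition $(x_2-\mu)\,\phi\!\left(\frac{x_2-\mu}{\sigma}\right) = (x_1-\mu)\,\phi\!\left(\frac{x_1-\mu}{\sigma}\right)$, and verify the probability dips at nearby values of $\sigma$. The specific $\mu$, $x_1$, $x_2$ below are illustrative, chosen so both endpoints lie above $\mu$:

```python
import math

def Phi(z):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def phi(z):
    # Standard normal pdf
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def prob_between(x1, x2, mu, sigma):
    # P(x1 <= X <= x2) for X ~ N(mu, sigma^2)
    return Phi((x2 - mu) / sigma) - Phi((x1 - mu) / sigma)

def optimal_variance(x1, x2, mu):
    # sigma^2 = ((x2-mu)^2 - (x1-mu)^2) / (2 * log((x2-mu)/(x1-mu)))
    # Requires (x2-mu)/(x1-mu) > 0, i.e. x1, x2 on the same side of mu.
    a, b = x1 - mu, x2 - mu
    return (b * b - a * a) / (2.0 * math.log(b / a))

mu, x1, x2 = 0.0, 1.0, 3.0          # illustrative values, both above mu
s = math.sqrt(optimal_variance(x1, x2, mu))
p_star = prob_between(x1, x2, mu, s)
print(s * s, p_star)
```

The candidate $\sigma$ should beat slightly smaller and slightly larger values of $\sigma$, confirming it is a local maximum rather than a minimum.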