Finding the Fisher's Information in a normal distribution with known $\mu$ and unknown $\sigma^{2}$


I have a random sample $x_{1},\dots,x_{n}$ from $X \sim N(\mu,\sigma^{2})$, where $\mu$ is known.

I have to apply the Cramér–Rao theorem, but while calculating the Fisher information I stumbled upon this problem:

$I(\sigma)=-E\left(\frac{n}{\sigma^{2}}-\frac{3\sum(x_{i}-\mu)^{2}}{\sigma^{4}}\right)=-\frac{n}{\sigma^{2}}+\frac{3}{\sigma^{4}}\,E\left(\sum(x_{i}-\mu)^{2}\right)$

$$E\left(\sum(x_{i}-\mu)^{2}\right)=?$$ ${\displaystyle \operatorname{E}[X]=\int_{\mathbb{R}} x f(x)\,dx.}$ But what is $f(x)$ here? Could it possibly be the function itself, ${\displaystyle \operatorname{E}[X]=\int_{\mathbb{R}} x\sum(x_{i}-\mu)^{2}\,dx}$?
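(As a numerical aside: since $E(X_{i}-\mu)^{2}=\operatorname{var}(X_{i})=\sigma^{2}$, this expectation should work out to $n\sigma^{2}$. A minimal numpy simulation can confirm that; the values of $\mu$, $\sigma^{2}$, $n$ below are arbitrary choices, not from the problem.)

```python
import numpy as np

# Sanity check that E[sum (X_i - mu)^2] = n * sigma^2:
# each (X_i - mu)^2 has expectation Var(X_i) = sigma^2.
rng = np.random.default_rng(0)
mu, sigma2, n = 2.0, 4.0, 10   # arbitrary illustrative values
reps = 200_000                 # Monte Carlo repetitions

samples = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
sums = ((samples - mu) ** 2).sum(axis=1)

print(sums.mean())  # approximately n * sigma2 = 40
```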

From the wiki, we know that the Fisher information matrix is: $$\begin{pmatrix} \frac{1}{\sigma^{2}} & 0 \\ 0 & \frac{1}{2\sigma^{4}} \end{pmatrix}$$

But I need a number; what is that matrix supposed to mean?

What is $I(\sigma^{2})$ for a normal distribution with $\mu$ - known and $\sigma^{2}$- unknown?


There are 2 answers below.


Let $\sigma ^ 2 = \theta $, thus $ X \sim N( \mu, \theta)$, hence $$ f_X(x; \theta) = \frac{1}{\sqrt{ 2 \pi \theta }} \exp\left( \frac {- (x - \mu ) ^ 2} { 2\theta} \right), $$

$$ l(\theta) = -\tfrac{1}{2} \ln \theta - \frac{(x - \mu)^2}{2\theta} + \text{constant}, $$

$$ l'(\theta) = -\frac{1}{2\theta} + \frac{(x - \mu)^2}{2\theta^2}, $$

$$ -\mathbb{E}\, l''(\theta) = -\mathbb{E}\left[ \frac{1}{2\theta^2} - \frac{(x - \mu)^2}{\theta^3} \right] = -\frac{1}{2\theta^2} + \frac{1}{\theta^2} = \frac{1}{2\theta^2}. $$

Use the additive property of Fisher information to get the information for a sample of size $n$, i.e., $$ I_{X_1,\dots,X_n}(\theta) = \frac{n}{2\theta^2} = \frac{n}{2\sigma^4}. $$ For the observed information, replace $\sigma^2$ with $$ S^2 = \frac{\sum_{i=1}^n (X_i - \mu)^2}{n}. $$ (And note that $\operatorname{var}(X) = \mathbb{E}(X - \mu)^2 = \sigma^2$.)
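The closed form $I(\theta) = n/(2\theta^2)$ can be sanity-checked by simulation: the Fisher information also equals the variance of the score $l'(\theta)$ for the whole sample. A minimal numpy sketch (the values of $\mu$, $\theta$, $n$ are arbitrary illustrative choices):

```python
import numpy as np

# Monte Carlo check of I(theta) = n / (2 theta^2) for theta = sigma^2:
# the Fisher information equals the variance of the score l'(theta).
rng = np.random.default_rng(1)
mu, theta, n = 0.0, 2.0, 5   # arbitrary illustrative values
reps = 400_000               # Monte Carlo repetitions

x = rng.normal(mu, np.sqrt(theta), size=(reps, n))
# Score of the sample: sum over i of d/dtheta of the per-observation log-likelihood.
score = (-1 / (2 * theta) + (x - mu) ** 2 / (2 * theta ** 2)).sum(axis=1)

print(score.var())           # approximately n / (2 * theta**2) = 0.625
print(n / (2 * theta ** 2))  # 0.625
```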


Fisher information is only a matrix when we have 2 or more unknown parameters. Since $\mu$ is known, we will get a single number.

For a random sample $X_1, \dots, X_n$, the Fisher information for the sample can be defined as

$$I_X(\theta)=-n \operatorname{E}\left[\frac{\partial^2\ln f(X|\theta)}{\partial \theta^2}\right]$$

where here $\theta = \sigma^2$. So we don't need to consider the individual $X_i$.
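The formula can also be confirmed symbolically by letting a CAS carry out the differentiation and the expectation; a minimal sympy sketch (symbol names are my own, not from the answer):

```python
import sympy as sp

# Symbolic check: with theta = sigma^2 and mu known,
# I_X(theta) = -n * E[ d^2 ln f(X|theta) / d theta^2 ] should equal n / (2 theta^2).
x, mu = sp.symbols('x mu', real=True)
theta, n = sp.symbols('theta n', positive=True)

f = sp.exp(-(x - mu) ** 2 / (2 * theta)) / sp.sqrt(2 * sp.pi * theta)
d2 = sp.diff(sp.log(f), theta, 2)

# Expectation over X ~ N(mu, theta): integrate against the density f.
info = sp.simplify(-n * sp.integrate(d2 * f, (x, -sp.oo, sp.oo)))
print(info)  # n/(2*theta**2)
```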