Log-likelihood of i.i.d. normally distributed random variables


I am given $X_1,\ldots,X_n$ i.i.d. random variables, where $X_i\sim N(\theta,\theta^2)$. I am asked to find the log-likelihood function along with its first two derivatives, as well as the Fisher information $i(\theta)$.

So far I have the following. Since the joint density of the outcome is the product of the marginal densities, all alike, I have

$L_X(\theta)=\frac{1}{\left(2\pi \theta^2\right)^{n/2}}e^{-\frac{1}{2\theta^2}\sum_{i=1}^n (x_i- \theta)^2}$

And thus the log-likelihood function (in my course we work with $\ell_X(\theta)=-\log L_X(\theta)$):

$\ell_X(\theta)=\frac{n}{2}\log(2\pi)+\frac{n}{2}\log(\theta^2)+\frac{1}{2\theta^2}\sum_{i=1}^n (x_i- \theta)^2$
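Before differentiating, a quick numerical sanity check of this expression can be reassuring. The sketch below (Python with numpy/scipy assumed; not part of the original question) compares it against minus the summed `scipy.stats.norm` log-densities.

```python
import numpy as np
from scipy.stats import norm

def neg_log_lik(theta, x):
    """Negative log-likelihood of X_1,...,X_n i.i.d. N(theta, theta^2)."""
    n = len(x)
    return (n / 2) * np.log(2 * np.pi) + (n / 2) * np.log(theta**2) \
        + np.sum((x - theta) ** 2) / (2 * theta**2)

rng = np.random.default_rng(0)
theta = 2.0
x = rng.normal(loc=theta, scale=theta, size=100)  # scale is the std. dev., theta

# Same quantity as minus the sum of the log-densities:
ref = -norm.logpdf(x, loc=theta, scale=theta).sum()
print(np.isclose(neg_log_lik(theta, x), ref))  # True
```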

And the first two derivatives:

$D\ell_X(\theta)=\frac{n}{\theta}+\frac{1}{\theta^2}\sum_{i=1}^n x_i-\frac{1}{\theta^3}\sum_{i=1}^n x_i^2$

$D^2\ell_X(\theta)=-\frac{n}{\theta^2}-\frac{2}{\theta^3}\sum_{i=1}^n x_i+\frac{3}{\theta^4}\sum_{i=1}^n x_i^2$
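Sign slips are easy to make in these derivatives, so a finite-difference check is worthwhile. A sketch (Python with numpy; the evaluation point is deliberately away from the optimum so the first derivative is not near zero — all names are my own):

```python
import numpy as np

def neg_log_lik(theta, x):
    n = len(x)
    return (n / 2) * np.log(2 * np.pi * theta**2) + np.sum((x - theta) ** 2) / (2 * theta**2)

def d1(theta, x):
    # n/theta + sum(x_i)/theta^2 - sum(x_i^2)/theta^3
    return len(x) / theta + x.sum() / theta**2 - (x**2).sum() / theta**3

def d2(theta, x):
    # -n/theta^2 - 2*sum(x_i)/theta^3 + 3*sum(x_i^2)/theta^4
    return -len(x) / theta**2 - 2 * x.sum() / theta**3 + 3 * (x**2).sum() / theta**4

rng = np.random.default_rng(1)
x = rng.normal(1.5, 1.5, size=50)
t, h = 1.0, 1e-5  # evaluate away from the MLE

# Central differences should agree with the closed forms.
num1 = (neg_log_lik(t + h, x) - neg_log_lik(t - h, x)) / (2 * h)
num2 = (d1(t + h, x) - d1(t - h, x)) / (2 * h)
print(np.isclose(num1, d1(t, x), rtol=1e-4), np.isclose(num2, d2(t, x), rtol=1e-4))
```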

But now I need to compute $i(\theta)=E_{\theta}\,D^2\ell_X(\theta)$, and that involves saying something about the mean of the independent random variables $X_i^2$, and I am struggling to find their distribution. I know that if $X_i\sim N(0,1)$, then $X_i^2$ would be chi-squared distributed with 1 degree of freedom, but that is not the case here.

The whole point is that I need to determine the maximum likelihood estimator for $\theta$, seeing as the mean and variance of my random variables are so closely and obviously related, and find its asymptotic distribution (smells like Cramér).
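For what it's worth, setting $D\ell_X(\theta)=0$ and multiplying through by $\theta^3/n$ gives the quadratic $\theta^2+\bar{x}\,\theta-\overline{x^2}=0$, with $\bar{x}=\frac1n\sum x_i$ and $\overline{x^2}=\frac1n\sum x_i^2$; for $\theta>0$ the relevant root is the positive one. A simulation sketch (Python with numpy; the parameter values are my own illustration):

```python
import numpy as np

# Stationarity of the negative log-likelihood leads to the quadratic
#   theta^2 + xbar*theta - x2bar = 0,   xbar = mean(x), x2bar = mean(x^2);
# assuming theta > 0, take the positive root.
rng = np.random.default_rng(7)
theta_true = 3.0
x = rng.normal(theta_true, theta_true, size=5000)

xbar, x2bar = x.mean(), (x**2).mean()
theta_hat = (-xbar + np.sqrt(xbar**2 + 4 * x2bar)) / 2
print(theta_hat)  # close to theta_true = 3.0
```

The spread of $\hat\theta$ around $\theta$ shrinks like $1/\sqrt{n}$, consistent with the asymptotic-normality result the question is after.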

1 Answer

You've almost got it. The Fisher information is a function of $\theta$; in the FI formula the expectation is taken over $X$, not over $\theta$. The FI formula can be written as

$$ i(\theta)=-E_X\!\left[\frac{\partial^2}{\partial\theta^2}\ln f(X;\theta)\,\middle|\,\theta\right] $$

As for the expectation itself, you don't need the distribution of $X_i^2$ at all: $E[X_i^2]=\operatorname{Var}(X_i)+\left(E[X_i]\right)^2=\theta^2+\theta^2=2\theta^2$.
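Plugging $E[X_i]=\theta$ and $E[X_i^2]=2\theta^2$ into the (correctly signed) second derivative $-\frac{n}{\theta^2}-\frac{2}{\theta^3}\sum x_i+\frac{3}{\theta^4}\sum x_i^2$ works out to $i(\theta)=3n/\theta^2$. A Monte Carlo sanity check (a sketch in Python with numpy; the sample sizes are my own choice):

```python
import numpy as np

# Monte Carlo estimate of i(theta) = E[D^2 l_X(theta)] (negative log-likelihood
# convention), which should match the closed form 3n / theta^2.
rng = np.random.default_rng(42)
theta, n, reps = 2.0, 10, 200_000

x = rng.normal(theta, theta, size=(reps, n))
d2 = (-n / theta**2
      - 2 * x.sum(axis=1) / theta**3
      + 3 * (x**2).sum(axis=1) / theta**4)

print(d2.mean(), 3 * n / theta**2)  # Monte Carlo average ~ 7.5
```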