Fisher information of sample


The Fisher information of a single random variable is defined as $$I(\theta)=E_\theta\left[\left(\frac{\partial}{\partial \theta} \ln f(X;\theta)\right)^2\right],$$ and under the usual regularity conditions we get that $$E_\theta\left[\left(\frac{\partial}{\partial \theta} \ln f(X;\theta)\right)^2\right]=-E_\theta\left[\frac{\partial^2}{\partial \theta^2}\ln f(X;\theta)\right].$$ If $X_1,X_2,\dots,X_n$ are i.i.d. random variables, the Fisher information of the sample is defined as $$I_n(\theta)=E_\theta\left[\left(\sum_{i=1}^n \frac{\partial}{\partial \theta} \ln f(X_i;\theta)\right)^2\right],$$ and my question is how to prove that $$E_\theta\left[\left(\sum_{i=1}^n \frac{\partial}{\partial \theta} \ln f(X_i;\theta)\right)^2\right]=-E_\theta\left[\frac{\partial^2}{\partial \theta^2} \sum_{i=1}^n \ln f(X_i;\theta)\right].$$



Expand the square and the cross terms vanish. Write $s_i=\frac{\partial}{\partial \theta}\ln f(X_i;\theta)$ for the score of the $i$-th observation. Then $$E_\theta\left[\left(\sum_{i=1}^n s_i\right)^2\right]=\sum_{i=1}^n E_\theta[s_i^2]+\sum_{i\neq j} E_\theta[s_i s_j].$$ For $i\neq j$, independence gives $E_\theta[s_i s_j]=E_\theta[s_i]\,E_\theta[s_j]$, and the score has mean zero, since $$E_\theta[s_i]=\int \frac{\partial}{\partial \theta}f(x;\theta)\,dx=\frac{\partial}{\partial \theta}\int f(x;\theta)\,dx=\frac{\partial}{\partial \theta}1=0$$ (interchanging differentiation and integration under the regularity conditions). Hence the cross terms drop out, and applying the single-variable identity to each remaining term gives $$E_\theta\left[\left(\sum_{i=1}^n s_i\right)^2\right]=\sum_{i=1}^n E_\theta[s_i^2]=-\sum_{i=1}^n E_\theta\left[\frac{\partial^2}{\partial \theta^2}\ln f(X_i;\theta)\right]=-E_\theta\left[\frac{\partial^2}{\partial \theta^2}\sum_{i=1}^n \ln f(X_i;\theta)\right].$$ In particular, $I_n(\theta)=nI(\theta)$.
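As a sanity check, the identity can be verified numerically by Monte Carlo. The sketch below uses the Poisson$(\theta)$ model, where $\ln f(x;\theta)=x\ln\theta-\theta-\ln x!$, so the score is $x/\theta-1$, the second derivative is $-x/\theta^2$, and the sample information is $n/\theta$. The choices of $\theta$, $n$, and the replicate count are arbitrary for illustration.

```python
import numpy as np

# Monte Carlo check that both forms of the sample Fisher information agree
# for the Poisson(theta) model:
#   d/dtheta   ln f(x; theta) = x/theta - 1      (score)
#   d2/dtheta2 ln f(x; theta) = -x/theta**2
# Exact sample information: I_n(theta) = n/theta.
theta, n, reps = 2.0, 5, 200_000
rng = np.random.default_rng(0)
x = rng.poisson(theta, size=(reps, n))  # reps samples of size n

# E[(sum_i d/dtheta ln f(X_i; theta))^2]
score_sq = np.mean(np.sum(x / theta - 1.0, axis=1) ** 2)

# -E[d2/dtheta2 sum_i ln f(X_i; theta)]
neg_hess = np.mean(np.sum(x / theta**2, axis=1))

exact = n / theta  # = 2.5
print(score_sq, neg_hess, exact)
```

Both estimates should agree with $n/\theta=2.5$ up to Monte Carlo error, illustrating that the squared-score and negative-Hessian forms coincide.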