Fisher's information for two independent random variables


If $X$ and $Y$ are two independent random variables, with regular distributions, how can I prove

$I_{X,Y}(\theta) = I_X(\theta) + I_Y(\theta)$ ?

Thanks!

I tried:

$$ {\rm E}_\theta \left[\left( \frac {\partial}{\partial\theta} \log f_\theta^{xy} (x,y)\right)^2 \right] ={\rm E}_\theta \left[\left( \frac {\partial}{\partial\theta} \log f_\theta^{x} (x)\right)^2 \right] + {\rm E}_\theta \left[\left( \frac {\partial}{\partial\theta} \log f_\theta^{y} (y)\right)^2 \right] + 2\,{\rm E}_\theta \left[\left( \frac {\partial}{\partial\theta} \log f_\theta^{y} (y)\right) \left( \frac {\partial}{\partial\theta} \log f_\theta^{x} (x)\right) \right] $$

and the last term should be equal to zero.
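One way to see why that cross term vanishes: independence factorises the expectation, and under the regularity conditions each score has zero mean, so

```latex
{\rm E}_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f_\theta^{y}(y)\right)
\left(\frac{\partial}{\partial\theta}\log f_\theta^{x}(x)\right)\right]
\overset{\text{indep.}}{=}
{\rm E}_\theta\!\left[\frac{\partial}{\partial\theta}\log f_\theta^{y}(y)\right]
{\rm E}_\theta\!\left[\frac{\partial}{\partial\theta}\log f_\theta^{x}(x)\right]
= 0\cdot 0 = 0.
```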

2 Answers

BEST ANSWER

First of all, if $X$ and $Y$ are two independent random variables, then $$f_{(X,Y)}(x,y;\theta)=f_X(x;\theta)\cdot f_Y(y;\theta)$$ $$\log (f_{(X,Y)}(x,y;\theta))=\log(f_X(x;\theta))+ \log(f_Y(y;\theta))$$

As mentioned, under certain regularity conditions: $$\mathcal{I}(\theta) =\operatorname{E} \left[\left. \left(\frac{\partial}{\partial\theta} \log f(X;\theta)\right)^2\right|\theta \right] = - \operatorname{E} \left[\left. \frac{\partial^2}{\partial\theta^2} \log f(X;\theta)\right|\theta \right]\,$$

So $$ \begin{eqnarray} \mathcal{I}_{XY}(\theta) &=&- \operatorname{E} \left[\left. \frac{\partial^2}{\partial\theta^2} \left(\log(f_X(X;\theta))+ \log(f_Y(Y;\theta))\right)\right|\theta \right] \\ &=&- \operatorname{E} \left[\left. \frac{\partial^2}{\partial\theta^2} \log(f_X(X;\theta))\right|\theta \right]\,- \operatorname{E} \left[\left. \frac{\partial^2}{\partial\theta^2} \log(f_Y(Y;\theta))\right|\theta \right] \\ &=&\mathcal{I}_X(\theta)+\mathcal{I}_Y(\theta) \end{eqnarray} $$
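A quick numerical sanity check of the additivity, for a case where the Fisher information is known in closed form: if $X \sim N(\theta, \sigma_1^2)$ and $Y \sim N(\theta, \sigma_2^2)$, then $I_X(\theta)=1/\sigma_1^2$, $I_Y(\theta)=1/\sigma_2^2$, and the joint score is the sum of the individual scores. This is just an illustrative Monte Carlo sketch, not part of the proof:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, s1, s2 = 2.0, 1.0, 0.5
n = 1_000_000

x = rng.normal(theta, s1, n)
y = rng.normal(theta, s2, n)

# Score of N(theta, sigma^2): d/dtheta log f = (x - theta) / sigma^2
score_x = (x - theta) / s1**2
score_y = (y - theta) / s2**2

I_x = np.mean(score_x**2)               # Monte Carlo estimate of 1/s1^2 = 1.0
I_y = np.mean(score_y**2)               # Monte Carlo estimate of 1/s2^2 = 4.0
I_xy = np.mean((score_x + score_y)**2)  # joint score = sum of scores (independence)

print(I_x, I_y, I_xy)  # I_xy should be close to I_x + I_y
```

The cross term `np.mean(score_x * score_y)` is near zero here precisely because the two scores are independent and each has zero mean.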

Another answer

Suppose $X$ and $Y$ are independent random variables whose distributions depend on some parameter $\theta$. Then the density of $(X,Y)$ is the product of the marginal densities, i.e. $$ f_{(X,Y)}(x,y;\theta)=f_X(x;\theta)\cdot f_Y(y;\theta)\tag{1} $$ for all $(x,y)$ and $\theta$.

The Fisher information of $(X,Y)$ is $$ I_{(X,Y)}(\theta)={\rm E}\left[\left(\frac{\partial}{\partial\theta}\log f_{(X,Y)}(X,Y;\theta)\right)^2\right]. $$

Under certain regularity conditions (which ensure that we may interchange ${\rm E}$ and $\frac{\partial}{\partial\theta}$) we have that $$ {\rm E}\left[\frac{\partial}{\partial\theta}\log f_{(X,Y)}(X,Y;\theta)\right]=0 $$ and hence $$ I_{(X,Y)}(\theta)=\mathrm{Var}\left(\frac{\partial}{\partial\theta}\log f_{(X,Y)}(X,Y;\theta)\right). $$ Use this to conclude.
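For completeness, here is one way to finish from that hint: by $(1)$ the joint score is the sum of the two marginal scores, which are independent, so the variance splits:

```latex
I_{(X,Y)}(\theta)
=\mathrm{Var}\!\left(\frac{\partial}{\partial\theta}\log f_X(X;\theta)
 +\frac{\partial}{\partial\theta}\log f_Y(Y;\theta)\right)
=\mathrm{Var}\!\left(\frac{\partial}{\partial\theta}\log f_X(X;\theta)\right)
 +\mathrm{Var}\!\left(\frac{\partial}{\partial\theta}\log f_Y(Y;\theta)\right)
=I_X(\theta)+I_Y(\theta).
```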