If $X$ and $Y$ are two independent random variables whose distributions satisfy the usual regularity conditions, how can I prove that
$I_{X,Y}(\theta) = I_X(\theta) + I_Y(\theta)$?
Thanks!
I tried:
$$ {\rm E}_\theta \left[\left( \frac {\partial}{\partial\theta} \log f_\theta^{xy} (x,y)\right)^2 \right] ={\rm E}_\theta \left[\left( \frac {\partial}{\partial\theta} \log f_\theta^{x} (x)\right)^2 \right] + {\rm E}_\theta \left[\left( \frac {\partial}{\partial\theta} \log f_\theta^{y} (y)\right)^2 \right] + 2\,{\rm E}_\theta \left[\left( \frac {\partial}{\partial\theta} \log f_\theta^{x} (x)\right) \left( \frac {\partial}{\partial\theta} \log f_\theta^{y} (y)\right) \right] $$
(note the factor of $2$ from expanding the square), and the last term should be equal to zero.
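To finish that route: the cross term vanishes because, by independence, the expectation factors,
$$ {\rm E}_\theta \left[ \frac {\partial}{\partial\theta} \log f_\theta^{x} (x)\, \frac {\partial}{\partial\theta} \log f_\theta^{y} (y) \right] = {\rm E}_\theta \left[ \frac {\partial}{\partial\theta} \log f_\theta^{x} (x) \right] {\rm E}_\theta \left[ \frac {\partial}{\partial\theta} \log f_\theta^{y} (y) \right] = 0, $$
and each factor is zero since, under the regularity conditions (differentiation under the integral sign),
$$ {\rm E}_\theta \left[ \frac {\partial}{\partial\theta} \log f_\theta (X) \right] = \int \frac{\frac{\partial}{\partial\theta} f_\theta(x)}{f_\theta(x)}\, f_\theta(x)\,dx = \frac {\partial}{\partial\theta} \int f_\theta(x)\,dx = \frac {\partial}{\partial\theta} 1 = 0. $$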
First of all, if $X$ and $Y$ are two independent random variables, then the joint density factorizes:
$$f_{(X,Y)}(x,y;\theta)=f_X(x;\theta)\cdot f_Y(y;\theta)$$
$$\log (f_{(X,Y)}(x,y;\theta))=\log(f_X(x;\theta))+ \log(f_Y(y;\theta))$$
As was mentioned, under certain regularity conditions
$$\mathcal{I}(\theta) =\operatorname{E} \left[\left. \left(\frac{\partial}{\partial\theta} \log f(X;\theta)\right)^2\right|\theta \right] = - \operatorname{E} \left[\left. \frac{\partial^2}{\partial\theta^2} \log f(X;\theta)\right|\theta \right]\,$$
So
$$ \begin{eqnarray} \mathcal{I}_{XY}(\theta) &=&- \operatorname{E} \left[\left. \frac{\partial^2}{\partial\theta^2} \bigl(\log(f_X(x;\theta))+ \log(f_Y(y;\theta))\bigr)\right|\theta \right]\\&=&- \operatorname{E} \left[\left. \frac{\partial^2}{\partial\theta^2} \log(f_X(x;\theta))\right|\theta \right]\,- \operatorname{E} \left[\left. \frac{\partial^2}{\partial\theta^2} \log(f_Y(y;\theta))\right|\theta \right]\\ &=&\mathcal{I}_X(\theta)+\mathcal{I}_Y(\theta) \end{eqnarray} $$
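As a numerical sanity check of the additivity (a sketch with a hypothetical example, not part of the proof): take $X \sim N(\theta, \sigma_x^2)$ and $Y \sim N(\theta, \sigma_y^2)$ independent, for which $\mathcal{I}_X(\theta) = 1/\sigma_x^2$ and $\mathcal{I}_Y(\theta) = 1/\sigma_y^2$. The score of $N(\theta,\sigma^2)$ at $x$ is $(x-\theta)/\sigma^2$, and the Fisher information is the expected squared score, which we estimate by Monte Carlo:

```python
import numpy as np

# Monte Carlo check of I_{X,Y}(theta) = I_X(theta) + I_Y(theta)
# for independent X ~ N(theta, sigma_x^2), Y ~ N(theta, sigma_y^2).
rng = np.random.default_rng(0)
theta, sigma_x, sigma_y = 1.0, 1.0, 2.0
n = 1_000_000

x = rng.normal(theta, sigma_x, n)
y = rng.normal(theta, sigma_y, n)

# Score functions: d/dtheta log f = (value - theta) / sigma^2.
score_x = (x - theta) / sigma_x**2
score_y = (y - theta) / sigma_y**2

# Fisher information = E[score^2]; the joint score is the sum of the
# marginal scores because the joint log-density is the sum of the logs.
I_x  = np.mean(score_x**2)               # should be close to 1/sigma_x^2 = 1.0
I_y  = np.mean(score_y**2)               # should be close to 1/sigma_y^2 = 0.25
I_xy = np.mean((score_x + score_y)**2)   # should be close to I_x + I_y

print(I_x, I_y, I_xy)
```

The cross term `2 * np.mean(score_x * score_y)` is what the simulation implicitly shows to be negligible: it is (twice) the sample mean of a product of two independent, mean-zero quantities.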