Problem in calculating Fisher information of some distribution


Let a probability distribution parameterized by $\theta\in\mathbb R^2$ be defined on $\mathbb R^2$ by the density function

$$f_\theta(z)=\frac1Z\exp(-\frac12(z-A\theta)^TV^{-1}(z-A\theta))$$

where $Z$ is the normalization constant needed to make this a density function, and

$$A=\left[ {\begin{array}{cc} 1 & 1 \\ 1 & -1 \\ \end{array} } \right], V=\left[ {\begin{array}{cc} 1 & \rho \\ \rho & 1 \\ \end{array} } \right]$$

I'm supposed to find that the Fisher information matrix is diagonal. Now, if I look at the partial derivatives of $\log f_\theta$ (after dropping the additive constant $-\log Z$), I get:

$$ \begin{align}\partial_{\theta_i}\log f_\theta(z)&=\partial_{\theta_i}\left(-\frac12\langle z-A\theta, V^{-1}(z-A\theta)\rangle\right) \\ &=-\frac12\left(\langle -A_i,V^{-1}(z-A\theta)\rangle + \langle z-A\theta, -(V^{-1}A)_i\rangle\right), \end{align}$$

where $B_i$ denotes the $i$-th column of $B$. Since $V^{-1}$ is symmetric, both terms equal $-\langle A_i, V^{-1}(z-A\theta)\rangle$, so the factor $-\frac12$ cancels and $I_i(z):=\partial_{\theta_i}\log f_\theta(z)=\langle A_i, V^{-1}(z-A\theta)\rangle$.

Is all that correct? Because if it is, I don't see how I can show that $E(I_1(z) I_2(z))=0$, which is what I need for the matrix to be diagonal.
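As a numeric sanity check of the score formula above (a sketch, not from the original post; the values of $\rho$, $\theta$, and $z$ below are arbitrary illustrative choices), one can compare the closed form $I_i(z)=\langle A_i, V^{-1}(z-A\theta)\rangle$ against a central finite-difference derivative of $\log f_\theta(z)$:

```python
import numpy as np

# Sanity check (illustrative values only): compare the derived score
#   I_i(z) = <A_i, V^{-1}(z - A theta)>
# against a central finite-difference derivative of log f_theta(z).
rho = 0.3
A = np.array([[1.0, 1.0], [1.0, -1.0]])
V = np.array([[1.0, rho], [rho, 1.0]])
Vinv = np.linalg.inv(V)

def log_f(z, theta):
    r = z - A @ theta
    return -0.5 * r @ Vinv @ r  # log-density up to the constant -log Z

theta = np.array([0.4, -1.1])
z = np.array([0.7, 2.0])
h = 1e-6

score = A.T @ Vinv @ (z - A @ theta)  # entries <A_i, V^{-1}(z - A theta)>
for i in range(2):
    e = np.zeros(2)
    e[i] = h
    fd = (log_f(z, theta + e) - log_f(z, theta - e)) / (2 * h)
    assert abs(fd - score[i]) < 1e-5  # closed form matches the numeric derivative
```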

1 Answer
Here $Z$ denotes the random vector with density $f_\theta$ (not the normalizing constant). Since $I_2$ is a scalar, $I_2=(Z-A\theta)^TV^{-1}A_2$, so

$$E(I_1I_2)=E[A_1^TV^{-1}(Z-A\theta)(Z-A\theta)^TV^{-1}A_2]$$

Now note $E[(Z-A\theta)(Z-A\theta)^T]=\operatorname{Cov}(Z)=V$, therefore $E(I_1I_2)=A_1^TV^{-1}VV^{-1}A_2=A_1^TV^{-1}A_2$, which is $0$: since $V^{-1}=\frac1{1-\rho^2}\left[\begin{smallmatrix}1&-\rho\\-\rho&1\end{smallmatrix}\right]$, we get $V^{-1}A_2=\frac1{1-\rho}\left[\begin{smallmatrix}1\\-1\end{smallmatrix}\right]$, which is orthogonal to $A_1=\left[\begin{smallmatrix}1\\1\end{smallmatrix}\right]$.
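The whole argument can be checked numerically (a sketch under the stated setup; the values of $\rho$ and $\theta$ are arbitrary choices, not from the post). For a Gaussian with mean $A\theta$ and fixed covariance $V$, the Fisher information is $A^TV^{-1}A$; its off-diagonal entry should vanish for every $\rho$, and a Monte Carlo estimate of $E(I_1I_2)$ should sit near $0$:

```python
import numpy as np

# Numeric check (illustrative values): the Fisher information A^T V^{-1} A
# is diagonal for every rho, with diagonal entries 2/(1+rho) and 2/(1-rho).
A = np.array([[1.0, 1.0], [1.0, -1.0]])
for rho in (-0.5, 0.0, 0.3, 0.9):
    V = np.array([[1.0, rho], [rho, 1.0]])
    fisher = A.T @ np.linalg.inv(V) @ A
    assert abs(fisher[0, 1]) < 1e-12 and abs(fisher[1, 0]) < 1e-12
    assert np.allclose(np.diag(fisher), [2 / (1 + rho), 2 / (1 - rho)])

# Monte Carlo estimate of E(I_1 I_2): draw Z ~ N(A theta, V) and average
# the product of the two score components.
rng = np.random.default_rng(0)
theta = np.array([0.4, -1.1])  # arbitrary parameter value
rho = 0.3
V = np.array([[1.0, rho], [rho, 1.0]])
Vinv = np.linalg.inv(V)
Zs = rng.multivariate_normal(A @ theta, V, size=200_000)
scores = (Zs - A @ theta) @ Vinv @ A  # each row is (I_1, I_2) for one sample
est = np.mean(scores[:, 0] * scores[:, 1])
assert abs(est) < 0.05  # close to 0, up to Monte Carlo error
```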