A decomposition of Fisher Information: is $I_{\mathbf X,T(\mathbf X)}=I_{\mathbf X}$ true?


Suppose $\mathbf X=(X_1,\dots,X_N)$, where the $X_n\sim f_X(\theta)$ are iid random variables parameterized by $\theta$. Denote the Fisher information by $$ I_{\mathbf X}(\theta)=-\mathsf E\left(\nabla_\theta\nabla_\theta^T\log f_{\mathbf X}(\mathbf X|\theta)\right), $$ and let $$ T(\mathbf X)=\left(\bar X,\frac{1}{N}\sum_{n=1}^N(X_n-\bar X)^2\right),\qquad \bar X=\frac{1}{N}\sum_{n=1}^NX_n, $$ be the pair of sample mean and sample variance.

Can we decompose $I_{\mathbf X}$ as $$ \tag{1} I_{\mathbf X}(\theta)=I_{T(\mathbf X)}(\theta)+I_{\mathbf X|T(\mathbf X)}(\theta)? $$ If so, how would this decomposition be derived?


I am aware of the decomposition for jointly distributed variables $(X,Y)$: $$ I_{X,Y}(\theta)=I_{Y}(\theta)+I_{X|Y}(\theta), $$ which follows easily from the factorization $f_{X,Y}=f_Y\,f_{X|Y}$ and the additivity of the logarithm. However, in the context of my question, shouldn't $I_{\mathbf X,T(\mathbf X)}=I_{\mathbf X}$, since knowledge of $T(\mathbf X)$ adds no information beyond $\mathbf X$ itself?
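One way to make that last intuition precise (a sketch, using the factorization of the joint density): since $T(\mathbf X)$ is a deterministic function of $\mathbf X$, the joint density of $(\mathbf X,T(\mathbf X))$ is $$ f_{\mathbf X,T(\mathbf X)}(\mathbf x,t\mid\theta)=f_{\mathbf X}(\mathbf x\mid\theta)\,\mathbf 1\{t=T(\mathbf x)\}, $$ and the indicator does not depend on $\theta$. Hence $$ \nabla_\theta\log f_{\mathbf X,T(\mathbf X)}(\mathbf x,t\mid\theta)=\nabla_\theta\log f_{\mathbf X}(\mathbf x\mid\theta), $$ so the scores coincide and $I_{\mathbf X,T(\mathbf X)}(\theta)=I_{\mathbf X}(\theta)$. Combined with the chain-rule decomposition above (applied with $Y=T(\mathbf X)$), this yields exactly the claimed identity $(1)$.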

Best Answer

If $U(X)$ is a statistic, then $I_{U(X)}=I_X$ whenever $U(X)$ is sufficient. First consider the identity statistic $U(x)=x$; it is trivially sufficient, since $P(X\in A\mid U(X)=u)=\mathbf{1}_A(u)$ does not depend on $P$ (i.e. on $\theta$). Now consider $(X,T(X))$, where $T(X)$ is any other statistic, and let $\pi_1$ denote the projection onto the first coordinate. Then $X=U(X)=\pi_1(X,T(X))$, so the sufficient statistic $U(X)$ is a function of $(X,T(X))$; any statistic through which a sufficient statistic factors is itself sufficient, so $(X,T(X))$ is sufficient. Therefore $I_{(X,T(X))}=I_X$, regardless of the choice of $T(X)$.
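As a numerical sanity check that a sufficient statistic preserves Fisher information, here is a minimal Monte Carlo sketch. It uses an illustrative model not taken from the question: $X_1,\dots,X_N$ iid $N(\mu,\sigma^2)$ with $\sigma$ known and $\theta=\mu$, for which the sample mean $\bar X$ is sufficient and $I_{\mathbf X}(\mu)=I_{\bar X}(\mu)=N/\sigma^2$. All constants (`mu`, `sigma`, `N`, `reps`) are arbitrary choices for the demonstration.

```python
import random
import statistics

random.seed(0)
mu, sigma, N, reps = 1.0, 2.0, 5, 100_000  # illustrative values

scores_full, scores_mean = [], []
for _ in range(reps):
    x = [random.gauss(mu, sigma) for _ in range(N)]
    # Score of the full sample: d/dmu log f(x|mu) = sum_n (x_n - mu) / sigma^2
    scores_full.append(sum(xn - mu for xn in x) / sigma**2)
    # Score of the statistic Xbar ~ N(mu, sigma^2/N): (xbar - mu) / (sigma^2/N)
    xbar = sum(x) / N
    scores_mean.append((xbar - mu) / (sigma**2 / N))

# Fisher information = variance of the score (estimated by Monte Carlo)
I_full = statistics.pvariance(scores_full)  # estimates I_X(mu) = N/sigma^2
I_mean = statistics.pvariance(scores_mean)  # estimates I_Xbar(mu) = N/sigma^2
print(I_full, I_mean, N / sigma**2)
```

Note that the two per-sample scores agree exactly, since $\sum_n(x_n-\mu)/\sigma^2 = (\bar x-\mu)/(\sigma^2/N)$: the score of the full sample depends on the data only through $\bar x$, which is precisely why $\bar X$ is sufficient and no information is lost.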