Likelihood and Fisher information of two dependent variables


Let $\{X, Y\}$ be two variables that depend on a parameter $\theta$, and let $z_1 = f_1(X,Y)$, $z_2 = f_2(X,Y)$ be two variables constructed from $\{X, Y\}$, where the functions $f_i$ are known. Denote by $I_{z_1, z_2}(\theta)$ and $I_{X, Y}(\theta)$ the Fisher information carried by $\{z_1, z_2\}$ and by $\{X, Y\}$, respectively.

We can write $$I_{z_1, z_2}(\theta) \leq I_{X, Y}(\theta),$$ where equality holds when the $z_i$ form a sufficient statistic (source). How do we obtain (i) the likelihood $p(z_1,z_2)$ and (ii) the Fisher information $I_{z_1, z_2}(\theta)$ from the primary variables' likelihood and information, $p(X,Y)$ and $I_{X, Y}(\theta)$?
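As a partial answer to (i), here is a sketch for the special case (my assumption, not stated above) where $f=(f_1,f_2)$ is a smooth bijection with differentiable inverse $(X,Y)=g(z_1,z_2)$; then the likelihood transports by the usual change of variables:
$$p(z_1,z_2\mid\theta)=p\big(g(z_1,z_2)\mid\theta\big)\,\bigl|\det Dg(z_1,z_2)\bigr|.$$
Since the Jacobian factor does not involve $\theta$, the scores coincide and $I_{z_1,z_2}(\theta)=I_{X,Y}(\theta)$ in this invertible case. The inequality can only be strict when $f$ is non-invertible, in which case $p(z_1,z_2\mid\theta)$ is obtained by integrating $p(X,Y\mid\theta)$ over the fibers $f^{-1}(z_1,z_2)$.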


Let us recall the definition of a sufficient statistic. Given a model $(P_{\theta}(dw))_{\theta\in \Theta}=e^{\ell_w(\theta)}\nu(dw)$ on $\Omega$, consider a measurable map $v=\varphi(w)$ from $\Omega$ to some measurable space $(\Omega_1,\mathcal{A}_1)$. Denote by $P_{\theta}(dw,dv)$ the joint distribution of $(w,\varphi(w))$ in $\Omega\times\Omega_1$ (thus actually concentrated on the 'curve' $\{(w,v)\ ;\ v=\varphi(w)\}$). We disintegrate \begin{equation}P_{\theta}(dw,dv)=H_{\theta}(dv)K_{\theta}(v,dw)\end{equation} where $H_{\theta}(dv)$ is the marginal distribution of the random variable $V=\varphi(w)$ and where the transition probability kernel $K_{\theta}(v,dw)$ is the conditional distribution of $w$ given $V=v.$ We shall say that $\varphi$ is a sufficient statistic if $\theta\mapsto K_{\theta}$ is constant.
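As a standard illustration of this definition (my example, not part of the argument above): let $w=(X_1,\dots,X_n)$ be i.i.d. Bernoulli$(\theta)$ and $\varphi(w)=X_1+\cdots+X_n$. Given $V=v$, the conditional distribution of $w$ is uniform on the $\binom{n}{v}$ binary sequences with sum $v$:
$$K_{\theta}(v,\{w\})=\binom{n}{v}^{-1}\quad\text{for every }w\text{ with }\varphi(w)=v,$$
which does not involve $\theta$. Hence $\theta\mapsto K_{\theta}$ is constant and $\varphi$ is a sufficient statistic.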

Proposition: Consider a Fisher model $(P_{\theta}(dw))_{\theta\in \Theta}$ on $\Omega$. Consider also a measurable map $v=\varphi(w)$ from $\Omega$ to some measurable space $(\Omega_1,\mathcal{A}_1)$. Suppose that the image $H(dv)$ of $\nu(dw)$ by $w\mapsto v= \varphi(w)$ exists. Consider the image $(H_{\theta}(dv))_{\theta\in \Theta}$ on $\Omega_1$ of $(P_{\theta}(dw))_{\theta\in \Theta}$ by $w\mapsto v= \varphi(w).$ Then $(H_{\theta})_{\theta\in \Theta}$ is a Fisher model, with $H_{\theta}(dv)=e^{h_v(\theta)}H(dv).$ Let $J(\theta)$ be its information matrix. Then $$I(\theta)-J(\theta)=\int _{\Omega}[(\ell_w'(\theta)-h_{\varphi(w)}'(\theta))\otimes (\ell_w'(\theta)-h_{\varphi(w)}'(\theta))]P_{\theta}(dw).$$ In particular $I(\theta)-J(\theta)$ is positive semidefinite. Furthermore $I(\theta)-J(\theta)=0$ for all $\theta\in \Theta$ if and only if $\varphi$ is a sufficient statistic.
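A worked instance of the proposition (my choice of model, for illustration): let $X,Y$ be i.i.d. $N(\theta,1)$, so $I(\theta)=2$. For $\varphi(X,Y)=X+Y\sim N(2\theta,2)$ we get
$$J(\theta)=\frac{(\partial_\theta\, 2\theta)^2}{2}=\frac{4}{2}=2,$$
so $I(\theta)-J(\theta)=0$, consistent with $X+Y$ being sufficient for $\theta$. By contrast, for $\varphi(X,Y)=X$ alone, $J(\theta)=1$ and $I(\theta)-J(\theta)=1>0$, quantifying the information lost by discarding $Y$.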