How do these results show that $T(\mathbf{X})$ is an unbiased estimator of $E_\varphi[T(\mathbf{X})]$ that achieves the Cramér-Rao lower bound?


Let's say that $X_1, \dots, X_n$ has the joint distribution $f_\varphi(\mathbf{x})$ that belongs to the one-parameter exponential family

$$f_\varphi(\mathbf{x}) = \exp{\left\{ c(\varphi) T(\mathbf{x}) + d(\varphi) + s(\mathbf{x}) \right\}},$$

where $\mathbf{x} \in \text{supp}(f_\varphi)$, $\text{supp}(f_\varphi)$ does not depend on $\varphi$, and $c^\prime(\varphi)$ is continuous and does not vanish.

I am told that the results

$$E_\varphi [T(\mathbf{X})] = - \dfrac{d^\prime (\varphi)}{c^\prime(\varphi)}$$

and

$$\text{Var}_\varphi [T(\mathbf{X})] = \dfrac{c^ {\prime \prime}(\varphi) d^\prime(\varphi) - c^\prime(\varphi)d^{\prime \prime}(\varphi)}{c^\prime(\varphi)^3}$$

can be used to show that $T(\mathbf{X})$ is an unbiased estimator of $E_\varphi[T(\mathbf{X})]$ that achieves the Cramér-Rao lower bound. However, it is not at all clear to me how this is done. How do these two results establish both the unbiasedness of $T(\mathbf{X})$ and the fact that its variance attains the bound?
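As a sanity check on the two formulas themselves, here is one concrete instance (my own choice, not part of the original problem): $X_1, \dots, X_n$ i.i.d. Exponential with rate $\varphi$.

```latex
% X_1,\dots,X_n iid Exponential(rate \varphi):
% f_\varphi(\mathbf{x}) = \exp\{-\varphi \textstyle\sum_i x_i + n\log\varphi\},
% so c(\varphi) = -\varphi, \; T(\mathbf{x}) = \sum_i x_i, \;
%    d(\varphi) = n\log\varphi, \; s(\mathbf{x}) = 0.
E_\varphi[T(\mathbf{X})] = -\frac{d'(\varphi)}{c'(\varphi)}
  = -\frac{n/\varphi}{-1} = \frac{n}{\varphi},
\qquad
\mathrm{Var}_\varphi[T(\mathbf{X})]
  = \frac{c''(\varphi)d'(\varphi) - c'(\varphi)d''(\varphi)}{c'(\varphi)^3}
  = \frac{0 - (-1)\left(-\tfrac{n}{\varphi^2}\right)}{(-1)^3}
  = \frac{n}{\varphi^2},
```

which matches the known mean $n/\varphi$ and variance $n/\varphi^2$ of a sum of $n$ i.i.d. Exponential($\varphi$) variables.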

Answer:

Define $\lambda_n(\mathbf{x} \mid \varphi) = \log f_\varphi(\mathbf{x}) = c(\varphi)T(\mathbf{x}) + d(\varphi) + s(\mathbf{x})$. Differentiating with respect to $\varphi$,

$$\lambda_n'(\mathbf{x} \mid \varphi) = c'(\varphi)T(\mathbf{x}) + d'(\varphi), \qquad \lambda_n''(\mathbf{x} \mid \varphi) = c''(\varphi)T(\mathbf{x}) + d''(\varphi).$$

Using $E_\varphi[T(\mathbf{X})] = -d'(\varphi)/c'(\varphi)$, the Fisher information of the sample is

$$I_n(\varphi) = -E_\varphi\left[\lambda_n''(\mathbf{X} \mid \varphi)\right] = -\left(-c''(\varphi)\frac{d'(\varphi)}{c'(\varphi)} + d''(\varphi)\right) = \frac{c''(\varphi)d'(\varphi) - d''(\varphi)c'(\varphi)}{c'(\varphi)}.$$

Also define $m(\varphi) = E_\varphi[T(\mathbf{X})]$, so that

$$m'(\varphi) = -\frac{c'd'' - d'c''}{c'^2} = \frac{d'c'' - c'd''}{c'^2}.$$

The Cramér-Rao inequality for an unbiased estimator of $m(\varphi)$ says

$$\mathrm{Var}_\varphi[T(\mathbf{X})] \ge \frac{[m'(\varphi)]^2}{I_n(\varphi)} = \frac{(d'c'' - c'd'')^2}{(c'^2)^2} \cdot \frac{c'}{c''d' - d''c'} = \frac{d'(\varphi)c''(\varphi) - c'(\varphi)d''(\varphi)}{c'(\varphi)^3}.$$

The right-hand side is exactly the stated expression for $\mathrm{Var}_\varphi[T(\mathbf{X})]$, so the inequality holds with equality: the variance of $T(\mathbf{X})$ attains the Cramér-Rao lower bound. As for unbiasedness, $T(\mathbf{X})$ is an unbiased estimator of its own expectation $E_\varphi[T(\mathbf{X})] = -d'(\varphi)/c'(\varphi)$ by definition. Thus $T(\mathbf{X})$ is an unbiased estimator of $-d'(\varphi)/c'(\varphi)$ whose variance achieves the Cramér-Rao lower bound.
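The chain of identities above can also be checked numerically. The sketch below uses a hypothetical Poisson instance chosen purely for illustration ($n$ i.i.d. Poisson($\varphi$) counts, so $c(\varphi) = \log\varphi$, $d(\varphi) = -n\varphi$, $T(\mathbf{x}) = \sum_i x_i$; the helper names are my own), evaluating $E_\varphi[T]$, $\mathrm{Var}_\varphi[T]$, $I_n(\varphi)$, and the Cramér-Rao bound from $c$ and $d$ alone:

```python
import math

# Hypothetical concrete instance: n i.i.d. Poisson(phi) counts, for which
# c(phi) = log(phi) and d(phi) = -n*phi (names chosen here for illustration).
n = 5.0

def c(phi):
    return math.log(phi)

def d(phi):
    return -n * phi

def deriv(f, x, h=1e-5):
    # central finite difference, second-order accurate
    return (f(x + h) - f(x - h)) / (2 * h)

def deriv2(f, x, h=1e-4):
    # central finite difference for the second derivative
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

phi = 2.0
cp, cpp = deriv(c, phi), deriv2(c, phi)
dp, dpp = deriv(d, phi), deriv2(d, phi)

ET = -dp / cp                          # E[T] = -d'/c'
VarT = (cpp * dp - cp * dpp) / cp**3   # Var[T] from the stated formula
I = (cpp * dp - dpp * cp) / cp         # Fisher information I_n(phi)
mprime = (dp * cpp - cp * dpp) / cp**2 # m'(phi), with m(phi) = E[T]
CRB = mprime**2 / I                    # Cramer-Rao bound for m(phi)

print(ET, VarT, CRB)  # all approximately n*phi = 10 for this Poisson instance
```

For the Poisson case the exact values are $E_\varphi[T] = \mathrm{Var}_\varphi[T] = n\varphi$ and $I_n(\varphi) = n/\varphi$, so the printed variance and bound coincide up to finite-difference error, consistent with the equality derived above.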