Understanding the Delta Method and Sufficiency of Estimators

Suppose $X_i$ are a random sample from some distribution with parameter $\theta$. So the $X_i$ are independent and identically distributed. Suppose I have some completely arbitrary estimator $\hat{\theta}_n$ that is sufficient for a single parameter $\theta$ of this distribution. Now suppose I have some continuous function $g(\theta)$.

Is it true that:

$$\sqrt{n}(g(\hat{\theta}_n)-g(\theta))\to^d N(0, V(X_i)(g'(\theta))^2)$$

where $\to^d$ represents convergence in distribution? If this is not true as stated, are there other conditions that need to be fulfilled in order to make it true? If it is true as stated, have I stated extraneous conditions; for example, is it necessary that $\hat{\theta}_n$ is sufficient for $\theta$?


There is no connection between sufficiency and the desired convergence. It requires other conditions:

1) Let $\hat{\theta}_n$ be an asymptotically normal estimator of $\theta$ with some asymptotic variance $\sigma^2(\theta)\neq 0$, namely $$\sqrt{n}(\hat{\theta}_n-\theta)\xrightarrow{d} N(0, \sigma^2(\theta)).$$ Note that in general $\sigma^2(\theta)$ does not equal $\mathop{Var}(X_1)$.

2) Let also $g$ be differentiable at the point $\theta$ with $g'(\theta)\neq 0$. Then $$ \sqrt{n}(g(\hat{\theta}_n)-g(\theta))\xrightarrow{d} N(0, \sigma^2(\theta) (g'(\theta))^2). $$
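The two conditions above can be checked by simulation. The following sketch uses my own illustrative choices (exponential data with mean $\theta$, $\hat{\theta}_n$ the sample mean, and $g(x)=x^2$), none of which come from the question itself:

```python
import numpy as np

# Monte Carlo sketch of the delta method.  Illustrative assumptions (my own,
# not from the question): X_i ~ Exponential with mean theta, theta_hat is the
# sample mean, so sigma^2(theta) = Var(X_1) = theta^2, and g(x) = x^2 with
# g'(theta) = 2*theta.  Predicted asymptotic variance:
# sigma^2(theta) * g'(theta)^2 = 4 * theta^4.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 1_000, 10_000

samples = rng.exponential(scale=theta, size=(reps, n))
theta_hat = samples.mean(axis=1)
z = np.sqrt(n) * (theta_hat**2 - theta**2)

predicted_var = theta**2 * (2 * theta) ** 2  # 4 * theta^4 = 64.0
print(predicted_var, z.var())                # empirical variance is close
```

The empirical variance of the rescaled error matches the delta-method prediction, not $\mathop{Var}(X_1)(g'(\theta))^2$.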

A sufficient statistic need not be an asymptotically normal estimator of the parameter; it need not even be consistent. In such cases, $g(\hat{\theta}_n)-g(\theta)$ does not converge to zero in probability, and $\sqrt{n}(g(\hat{\theta}_n)-g(\theta))$ diverges.

Say, for the Uniform$(0,\theta)$ distribution the following one-dimensional statistics are all sufficient for $\theta$: $X_{(n)}=\max\{X_1,\ldots,X_n\}$, $n^2X_{(n)}$, $\frac{X_{(n)}}{n^{25}}$, and so on. Only the first of these is a consistent estimator, and even it is not asymptotically normal: $n(\theta-X_{(n)})$ converges in distribution to an exponential limit, not a normal one.
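A quick simulation (my own illustration, not part of the answer) shows the non-normal limit of the maximum: $n(\theta - X_{(n)})$ converges in distribution to an Exponential with mean $\theta$, a skewed, one-sided law.

```python
import numpy as np

# Sketch (illustrative): for Uniform(0, theta), the sufficient statistic
# X_(n) = max X_i is consistent, but n*(theta - X_(n)) converges in
# distribution to an Exponential with mean theta -- skewed, hence not normal.
rng = np.random.default_rng(1)
theta, n, reps = 3.0, 1_000, 10_000

x = rng.uniform(0.0, theta, size=(reps, n))
gap = n * (theta - x.max(axis=1))

print(gap.mean())      # close to theta, the exponential mean
print(np.median(gap))  # close to theta*ln(2); mean != median, so not normal
```

The clear gap between the mean and the median is the signature of the skewed exponential limit; a normal limit would make them coincide.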

For the normal distribution $N(0,\theta)$ with variance $\theta$, any of the following statistics is sufficient: $\sum_{i=1}^n X_i^2$, $\left(\sum_{i=1}^n X_i^2\right)^2$, $\frac{\sum_{i=1}^n X_i^2}{n^7}$, and so on. None of these is even a consistent estimator.
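The lack of consistency is easy to see numerically; this sketch (my own illustration) checks two of the statistics above:

```python
import numpy as np

# Sketch (illustrative): none of the sufficient statistics above is a
# consistent estimator of theta.  sum(X_i^2) grows like n*theta, while
# sum(X_i^2)/n^7 collapses to zero; neither converges to theta.
rng = np.random.default_rng(2)
theta, n = 1.5, 100_000

x = rng.normal(0.0, np.sqrt(theta), size=n)
s = (x**2).sum()
print(s)         # about n*theta = 150000, nowhere near theta
print(s / n**7)  # essentially 0, also nowhere near theta
```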

On the other hand, the sufficient statistic $$\frac{\sum_{i=1}^n X_i^2}{n}$$ is asymptotically normal: $$ \sqrt{n}\left(\frac{\sum_{i=1}^n X_i^2}{n}-\theta\right) \xrightarrow{d} N(0,\mathop{Var}(X_1^2))= N(0,2\theta^2). $$ Here the asymptotic variance is $\mathop{Var}(X_1^2)$, which is not $\mathop{Var}(X_1)$.
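This last claim is also easy to verify by simulation (again my own illustration): the CLT applied to the $X_i^2$ gives asymptotic variance $\mathop{Var}(X_1^2)=2\theta^2$.

```python
import numpy as np

# Sketch (illustrative): asymptotic normality of sum(X_i^2)/n for
# X_i ~ N(0, theta).  The CLT applied to the X_i^2 predicts asymptotic
# variance Var(X_1^2) = 2*theta^2, not Var(X_1) = theta.
rng = np.random.default_rng(3)
theta, n, reps = 1.5, 1_000, 10_000

x = rng.normal(0.0, np.sqrt(theta), size=(reps, n))
z = np.sqrt(n) * ((x**2).mean(axis=1) - theta)

predicted_var = 2 * theta**2   # = 4.5, versus Var(X_1) = theta = 1.5
print(predicted_var, z.var())  # the two agree closely
```

The empirical variance sits near $2\theta^2$, well away from $\mathop{Var}(X_1)=\theta$, confirming that the correct plug-in is the variance of the transformed observations.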