Delta Method when sequence might not converge


The delta method says that, if $g$ is differentiable at $\theta$, $\{X_n\}$, $Y$ are random variables, and $a_n \to \infty$ as $n \to \infty$, we have

$a_n (X_n - \theta) \xrightarrow{D}\ Y \implies a_n(g(X_n) - g(\theta) ) \xrightarrow{D}\ \nabla g(\theta)^T Y$

However, does a similar statement hold if convergence to $Y$ is replaced by boundedness in probability? Is it true that

$a_n (X_n - \theta) = O_p(1) \implies a_n(g(X_n) - g(\theta) ) = \nabla g(\theta)^T O_p(1)$

where $\nabla g(\theta)^T O_p(1) = O_p(1)$ if $\nabla g(\theta) \ne 0$, and $\nabla g(\theta)^T O_p(1) = o_p(1)$ if $\nabla g(\theta) = 0$?
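A quick numerical sanity check of what I am asking (my own sketch, with the hypothetical choices $g(x) = x^2$, $\theta = 2$, $a_n = \sqrt{n}$, and $a_n(X_n - \theta) = Z \sim N(0,1)$ exactly, so it is $O_p(1)$ by construction):

```python
import numpy as np

# Sketch: with g(x) = x**2, theta = 2, a_n = sqrt(n), and
# a_n * (X_n - theta) = Z ~ N(0, 1) (hence O_p(1) by construction),
# check that a_n * (g(X_n) - g(theta)) is close to g'(theta) * Z.
rng = np.random.default_rng(0)

theta = 2.0
n = 10_000
a_n = np.sqrt(n)

z = rng.standard_normal(100_000)      # a_n * (X_n - theta) = Z, bounded in probability
x_n = theta + z / a_n                 # X_n = theta + Z / a_n

lhs = a_n * (x_n**2 - theta**2)       # a_n * (g(X_n) - g(theta))
linear = 2 * theta * z                # first-order term g'(theta) * Z

# Remainder is exactly Z**2 / a_n here, which shrinks as n grows.
print(np.max(np.abs(lhs - linear)))
```

The remainder term is uniformly small for large $n$, which is the behavior the question asks about.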

Best answer:

Yes; the error of approximation goes to zero regardless. See the Wikipedia page on the delta method, under "Proof with an explicit order of approximation".
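A minimal sketch of why boundedness in probability suffices, using the same first-order expansion as the standard proof (assuming $g$ is differentiable at $\theta$):

```latex
\begin{align}
g(X_n) - g(\theta)
  &= \nabla g(\theta)^T (X_n - \theta) + o(\|X_n - \theta\|), \\
a_n \bigl( g(X_n) - g(\theta) \bigr)
  &= \nabla g(\theta)^T \, a_n (X_n - \theta)
     + a_n \, o(\|X_n - \theta\|).
\end{align}
```

Since $a_n(X_n - \theta) = O_p(1)$ and $a_n \to \infty$, we get $X_n - \theta = O_p(a_n^{-1}) = o_p(1)$, so the remainder is $o_p(1) \cdot O_p(1) = o_p(1)$, and the right-hand side is $\nabla g(\theta)^T O_p(1) + o_p(1)$, as conjectured.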