Taylor expansion for sequences


Let $\theta$, $\{\theta_n\}_{n=1}^{\infty}$ and $\{X_n\}_{n=1}^{\infty}$ be a real number, a sequence of real numbers, and a sequence of random variables, respectively, satisfying $\theta_n\rightarrow \theta$ and $X_n\overset{p}{\longrightarrow}\theta$. Let $f(\cdot)$ be a differentiable function. By Taylor expansion, we have

$f(\theta_n) = f(\theta) + f'(\theta)(\theta_n-\theta) + o(|\theta_n-\theta|).$

However, is the following generalization also correct?

$f(X_n) = f(\theta_n) + f'(\theta_n)(X_n-\theta_n) + o_p(|X_n-\theta_n|)$
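The claimed expansion can be probed numerically. The sketch below is a minimal simulation under illustrative choices that are not part of the question: $f = \exp$, $\theta = 0$, $\theta_n = 1/n$, and $X_n = \theta + Z/\sqrt{n}$ with $Z$ standard normal, so that $\theta_n \to \theta$ and $X_n \overset{p}{\to} \theta$. The $o_p(|X_n-\theta_n|)$ statement means that for every $\epsilon > 0$, $\mathbb{P}(|r_n| > \epsilon\,|X_n - \theta_n|) \to 0$, where $r_n$ is the remainder; we estimate that probability by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (mine, not from the question): f = exp,
# theta = 0, theta_n = 1/n, X_n = theta + Z/sqrt(n) with Z ~ N(0,1),
# so theta_n -> theta and X_n -> theta in probability.
f, fprime = np.exp, np.exp
theta = 0.0

def remainder_exceedance(n, eps=0.05, reps=200_000):
    """Estimate P(|r_n| > eps * |X_n - theta_n|), where r_n is the
    remainder in f(X_n) = f(theta_n) + f'(theta_n)(X_n - theta_n) + r_n.
    r_n = o_p(|X_n - theta_n|) means this probability -> 0 for every eps > 0."""
    theta_n = theta + 1.0 / n
    x = theta + rng.standard_normal(reps) / np.sqrt(n)
    r = f(x) - f(theta_n) - fprime(theta_n) * (x - theta_n)
    return np.mean(np.abs(r) > eps * np.abs(x - theta_n))

for n in (10, 100, 10_000):
    print(n, remainder_exceedance(n))
```

For this particular setup the exceedance probability visibly shrinks as $n$ grows, which is consistent with (though of course no proof of) the claimed expansion.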

Answer:

I am assuming that the intended generalization is rather $$f(X_n) = f(\theta) + f'(\theta)(X_n - \theta) + o_p(|X_n - \theta|).$$

In that case, notice that by differentiability of $f$ at $\theta$, there exists a function $g$ with $\lim_{x \to 0} g(x) = 0$ such that $$f(x) = f(\theta) + f'(\theta)(x - \theta) + (x - \theta)g(x - \theta)$$ for every real $x$. Since this identity holds pointwise, we may substitute $X_n$ for $x$ and get $$f(X_n) = f(\theta) + f'(\theta)(X_n - \theta) + (X_n - \theta)g(X_n - \theta),$$ so the proof will be complete if we are able to prove that $g(X_n - \theta) \to 0$ in probability.

Now fix $\epsilon > 0$; then there exists $\delta > 0$ such that $|g(x)| < \epsilon$ whenever $|x| < \delta$, which gives the inclusion $\{|g(X_n - \theta)| \geq \epsilon\} \subset \{|X_n - \theta| \geq \delta\}$. Hence $$\mathbb{P}(|g(X_n - \theta)| \geq \epsilon) \leq \mathbb{P}(|X_n - \theta| \geq \delta) \to 0$$ since $X_n \to \theta$ in probability.
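The tail-probability inequality at the heart of the argument can also be checked by simulation. This is a minimal sketch under assumed choices not taken from the answer: $f(x) = x^3$ at $\theta = 1$ (so the remainder function is explicitly $g(h) = 3h + h^2$), and $X_n = \theta + Z/\sqrt{n}$ with $Z$ standard normal. For $\epsilon = 0.1$ we may take $\delta = 0.03$, since $|3h + h^2| \leq 3(0.03) + 0.03^2 < 0.1$ whenever $|h| < 0.03$; the event inclusion then forces the empirical frequencies to satisfy the inequality exactly, sample by sample.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative setup (my choice, not from the answer): f(x) = x**3 at
# theta = 1, X_n = theta + Z/sqrt(n), so X_n -> theta in probability.
theta = 1.0

def g(h):
    # Remainder function: f(theta + h) = f(theta) + f'(theta)*h + h*g(h).
    # For f(x) = x**3 at theta = 1 this works out to g(h) = 3h + h**2.
    return 3.0 * h + h**2

eps = 0.1
delta = 0.03  # |g(h)| <= 3*delta + delta**2 < eps whenever |h| < delta

def tail_probs(n, reps=200_000):
    """Estimate P(|g(X_n - theta)| >= eps) and P(|X_n - theta| >= delta)."""
    h = rng.standard_normal(reps) / np.sqrt(n)   # h = X_n - theta
    p_g = np.mean(np.abs(g(h)) >= eps)
    p_h = np.mean(np.abs(h) >= delta)
    return p_g, p_h

for n in (100, 1_000, 10_000):
    p_g, p_h = tail_probs(n)
    # The inclusion {|g| >= eps} ⊂ {|h| >= delta} forces p_g <= p_h.
    print(n, p_g, p_h)
```

Both estimated probabilities decay as $n$ grows, and the first never exceeds the second, mirroring the inclusion argument in the proof.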