Convergence in distribution under a special condition


Let $a \in \mathbb{R}$ and $v > 0$, and let $(X_n)_n$ be a sequence of real-valued random variables with $\sqrt{n}\,(X_n - a) \xrightarrow{d} N(0,v)$ as $n \to \infty$. Let $f : \mathbb{R} \to \mathbb{R}$ be differentiable at $a$ with $f'(a)^2 \cdot v = 1$. Show that $\sqrt{n}\,(f(X_n) - f(a)) \xrightarrow{d} N(0,1)$ as $n \to \infty$.

Accepted answer:

By the Skorohod representation theorem, there exist a probability space $(\tilde{\Omega},\tilde{\mathcal{A}},\tilde{\mathbb{P}})$ and a sequence $(\tilde{X}_n)_{n \in \mathbb{N}}$ of random variables on $(\tilde{\Omega},\tilde{\mathcal{A}},\tilde{\mathbb{P}})$ such that $X_n \stackrel{d}{=} \tilde{X}_n$ for all $n$ and $$\sqrt{n} (\tilde{X}_n-a) \to G \quad \text{almost surely}$$ for some random variable $G \sim N(0,v)$. In particular, $$\tilde{X}_n - a = \frac{1}{\sqrt{n}} \cdot \sqrt{n}(\tilde{X}_n - a) \to 0 \cdot G = 0 \quad \text{almost surely}.$$ The differentiability of $f$ at $a$ now implies that $$\sqrt{n} (f(\tilde{X}_n)-f(a)) = \underbrace{\sqrt{n} (\tilde{X}_n-a)}_{\xrightarrow[]{n \to \infty} G} \cdot \underbrace{\frac{f(\tilde{X}_n)-f(a)}{\tilde{X}_n-a}}_{\xrightarrow[]{n \to \infty} f'(a)} \xrightarrow[]{n \to \infty} f'(a)\, G$$ almost surely (on the event $\{\tilde{X}_n = a\}$ read the difference quotient as $f'(a)$, so the identity still holds). The limit $f'(a)\,G$ is normal with mean $0$ and variance $f'(a)^2 \cdot v = 1$, i.e. $f'(a)\,G \sim N(0,1)$. Since almost sure convergence implies convergence in distribution, and $X_n \stackrel{d}{=} \tilde{X}_n$ gives $\sqrt{n}(f(X_n)-f(a)) \stackrel{d}{=} \sqrt{n}(f(\tilde{X}_n)-f(a))$, this proves the assertion.
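The statement can be sanity-checked numerically. A minimal Monte Carlo sketch (illustrative only, not part of the proof, and the concrete choices below are my own): take $X_n$ to be the sample mean of $n$ i.i.d. $N(a,1)$ observations with $a = 1/2$ and $v = 1$, and $f(x) = x^2$, so that $f'(a)^2 \cdot v = (2a)^2 = 1$. Then $\sqrt{n}(f(X_n) - f(a))$ should be approximately $N(0,1)$.

```python
import math
import random

random.seed(0)

a = 0.5       # limit point; each observation has variance v = 1
n = 400       # sample size inside each replication
m = 20_000    # number of Monte Carlo replications

def f(x):
    return x * x  # f'(a)^2 * v = (2a)^2 * 1 = 1

# For i.i.d. N(a, 1) data the sample mean is exactly N(a, 1/n),
# so we can draw the mean directly instead of averaging n variables.
stats = []
for _ in range(m):
    x_bar = random.gauss(a, 1 / math.sqrt(n))
    stats.append(math.sqrt(n) * (f(x_bar) - f(a)))

mean = sum(stats) / m
var = sum((t - mean) ** 2 for t in stats) / (m - 1)
print(f"mean = {mean:.3f}, variance = {var:.3f}")  # should be close to 0 and 1
```

The small positive bias in the mean (of order $1/\sqrt{n}$, coming from $\mathbb{E}[\bar{X}^2] = a^2 + 1/n$) shrinks as $n$ grows, consistent with the asymptotic statement.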