Use Slutsky's theorem to show that: $\sqrt{n}(e^{\frac{S_n}{n}}-e^{\mu}) \xrightarrow{d} \sigma e^{\mu}Z$


Let $\{X_n\}_{n\ge1}$ be a sequence of i.i.d. random variables with common mean $\mu$ and variance $\sigma^2 \in (0,\infty)$. Use Slutsky's theorem to show that: \begin{align} \sqrt{n}(e^{\frac{S_n}{n}}-e^{\mu}) \xrightarrow{d} \sigma e^{\mu}Z \end{align} where $S_n=\sum\limits_{k=1}^{n}X_k$ and $Z\in N(0,1)$.

I have used Slutsky's theorem in plenty of problems before but I cannot make any progress on this one, any help here would be greatly appreciated.


Accepted answer:

Define $$ G(t)=\begin{cases}\dfrac{e^t-e^\mu}{t-\mu}, & t\neq \mu,\cr e^\mu, & t=\mu.\end{cases} $$ This function is continuous everywhere, since $\lim\limits_{t\to\mu}\dfrac{e^t-e^\mu}{t-\mu}=(e^t)'\big|_{t=\mu}=e^\mu$. By the LLN, $\frac{S_n}{n}\xrightarrow{p}\mu$, so by the continuous mapping theorem, $$ G\left(\frac{S_n}{n}\right)\xrightarrow{p}G(\mu)=e^\mu. $$ Then $$ \sqrt{n}\left(e^{\frac{S_n}{n}}-e^{\mu}\right)=\sqrt{n}\left(\frac{S_n}{n}-\mu\right)\cdot G\left(\frac{S_n}{n}\right). $$ The first factor satisfies $\sqrt{n}\left(\frac{S_n}{n}-\mu\right)\xrightarrow{d}\sigma Z$ by the CLT, where $Z\sim N(0,1)$, and the second factor converges in probability to $e^\mu$. Slutsky's theorem then implies that the product converges in distribution to $\sigma e^\mu Z$.
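As a numerical sanity check of the limit (not part of the proof), one can simulate the statistic. The choice $X_i\sim N(\mu,\sigma^2)$ and the sample sizes below are illustrative assumptions; the sample standard deviation of $\sqrt{n}(e^{S_n/n}-e^{\mu})$ should be close to $\sigma e^{\mu}$:

```python
import numpy as np

# Monte Carlo check of sqrt(n)(e^{S_n/n} - e^mu) -> sigma * e^mu * Z.
# The distribution X_i ~ Normal(mu, sigma^2) and the sizes n, reps
# are illustrative assumptions.
rng = np.random.default_rng(0)
mu, sigma = 0.5, 1.0
n, reps = 1_000, 10_000

# reps independent copies of the sample mean S_n / n
sample_means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
T = np.sqrt(n) * (np.exp(sample_means) - np.exp(mu))

# The limiting distribution is N(0, (sigma * e^mu)^2).
print(T.std(), sigma * np.exp(mu))
```

The two printed numbers should agree to within a few percent, consistent with the limiting variance $\sigma^2 e^{2\mu}$.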

Note that this is essentially the same reasoning as in the other answer, and indeed the same idea as the so-called delta method.

Second answer:

This is not a direct solution to your problem, but the situation looks like a natural candidate for the delta method. Here is the result; I think filling in the details for your situation is straightforward:

Theorem: Let $X_n$, $n\in\mathbb{N}$, and $Y$ be random vectors in $\mathbb{R}^m$, and let $c\in\mathbb{R}^m$. Suppose $a_n\xrightarrow{n\rightarrow\infty}\infty$ and $a_n(X_n-c)\stackrel{n\rightarrow\infty}{\Longrightarrow} Y$. If $g$ is differentiable at $c$, then \begin{aligned} a_n(g(X_n)-g(c))\stackrel{n\rightarrow\infty}{\Longrightarrow} g'(c)Y \end{aligned}

You can find this result in many statistics books (Shao's Mathematical Statistics, for example). The proof combines a first-order Taylor approximation of $g(X_n)$ with Slutsky's lemma.
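Filling in the details for this problem amounts to one substitution; a sketch in the theorem's notation:

```latex
% Apply the theorem with m = 1, X_n := S_n/n, c := \mu,
% a_n := \sqrt{n}, g(t) := e^t, and Y := \sigma Z.
\begin{align*}
\sqrt{n}\Bigl(\tfrac{S_n}{n}-\mu\Bigr)
  &\xrightarrow{d} \sigma Z
  && \text{(CLT)}\\
\sqrt{n}\Bigl(e^{S_n/n}-e^{\mu}\Bigr)
  = \sqrt{n}\Bigl(g\bigl(\tfrac{S_n}{n}\bigr)-g(\mu)\Bigr)
  &\xrightarrow{d} g'(\mu)\,\sigma Z = \sigma e^{\mu} Z
  && \text{(delta method)}
\end{align*}
```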

Here is a sketch of the proof. Using a first-order Taylor expansion we obtain \begin{equation} g(x)-g(c)-g'(c)(x-c)=o(x-c), \end{equation} where $|o(x-c)|/|x-c|\rightarrow0$ as $x\rightarrow c$. By Slutsky's theorem, it is enough to show that $a_n\,o(X_n-c)$ converges in law, and hence in measure, to $0$. Since $a_n(X_n-c)$ converges in law and $a_n\rightarrow\infty$, Slutsky's theorem and convergence-in-measure arguments show that $X_n-c=\frac{1}{a_n}a_n(X_n-c)$ converges in law, and hence in measure, to $0$. For any $\varepsilon>0$, there is $\delta>0$ such that $|x-c|<\delta$ implies $|o(x-c)|<\varepsilon|x-c|$. Hence, \begin{aligned} \Pr[|o(X_n-c)|/|X_n-c|\geq\varepsilon]\leq\Pr[|X_n-c|\geq\delta]\rightarrow0 \end{aligned} as $n\rightarrow\infty$, so $o(X_n-c)/|X_n-c|$ converges in measure to $0$. As $a_n(X_n-c)$ converges in law, Slutsky's theorem gives that $|a_n\,o(X_n-c)|=|a_n(X_n-c)|\cdot\frac{|o(X_n-c)|}{|X_n-c|}$ converges in law, and hence in measure, to $0$.
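The vanishing remainder $a_n\,o(X_n-c)$ in this sketch can also be seen numerically. The distribution and parameters below are illustrative assumptions, with $g(t)=e^t$, $c=\mu$, $a_n=\sqrt{n}$, $X_n=S_n/n$:

```python
import numpy as np

# Illustrative check that the Taylor remainder a_n * o(X_n - c) shrinks,
# here with g(t) = e^t, c = mu, a_n = sqrt(n), X_n = S_n/n, and
# X_i ~ Normal(mu, sigma^2) (hypothetical choices for illustration).
rng = np.random.default_rng(1)
mu, sigma = 0.5, 1.0

rem = {}
for n in (100, 2_500):
    means = rng.normal(mu, sigma, size=(2_000, n)).mean(axis=1)  # copies of S_n/n
    exact = np.sqrt(n) * (np.exp(means) - np.exp(mu))            # a_n (g(X_n) - g(c))
    linear = np.exp(mu) * np.sqrt(n) * (means - mu)              # a_n g'(c)(X_n - c)
    rem[n] = np.abs(exact - linear).mean()                       # mean |a_n o(X_n - c)|
print(rem)
```

The average remainder should be noticeably smaller at $n=2500$ than at $n=100$, consistent with the $o(\cdot)$ term dying faster than $a_n$ grows.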