Let $(Y_n)_{n\in \mathbb{N}}$ be independent and identically distributed random variables. Let $A_n=\sum_{p=1}^nY_p\quad \forall n\in \mathbb{N}$.
If $\varphi_{Y_1}$ is differentiable at $0$, show that there exists a $z\in \mathbb{R}$ such that $$\frac{A_n}{n}\rightarrow z$$ in probability as $n\rightarrow \infty$.
I know that in order to have convergence in probability, the following must hold: $$\lim_{n\rightarrow \infty}P\left(\left|\frac{A_n}{n}-z\right|>\epsilon\right)=0$$
for all $\epsilon >0$. I somehow need to use the fact that $\varphi_{Y_1}$ is differentiable at $0$, but I don't know where to start.
In order to show $A_n/n\to z$ in probability, it suffices to show $A_n/n\to z$ in distribution, since the limit is a constant (prove this!). By Lévy's continuity theorem, this amounts to showing that the characteristic function of $A_n/n$ converges pointwise to $e^{izt}$, the characteristic function of the constant $z$.
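The "prove this!" step can be sketched via cdfs. If $A_n/n \to z$ in distribution, the limiting cdf $F$ is $0$ on $(-\infty,z)$ and $1$ on $[z,\infty)$, so every point except $z$ is a continuity point of $F$. Writing $F_n$ for the cdf of $A_n/n$, for any $\epsilon>0$:
$$P\left(\left|\frac{A_n}{n}-z\right|>\epsilon\right)\le F_n(z-\epsilon)+1-F_n(z+\epsilon)\;\xrightarrow{n\to\infty}\;F(z-\epsilon)+1-F(z+\epsilon)=0+1-1=0.$$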
The characteristic function of $A_n/n$ is $\varphi(t/n)^n$, where $\varphi:=\varphi_{Y_1}$. Now
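For completeness, the identity $\varphi_{A_n/n}(t)=\varphi(t/n)^n$ follows from independence and identical distribution:
$$\varphi_{A_n/n}(t)=E\left[e^{itA_n/n}\right]=E\left[\prod_{p=1}^n e^{i(t/n)Y_p}\right]=\prod_{p=1}^n \varphi_{Y_p}(t/n)=\varphi(t/n)^n,$$
where independence gives the third equality and identical distribution the last.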
$$ \lim_{n\to\infty}\varphi(t/n)^n = \exp\Bigl(\lim_{n\to\infty} \log \varphi(t/n)^n\Bigr), $$ which is justified for large $n$ because $\varphi(t/n)\to\varphi(0)=1$, so the logarithm is defined and $\exp$ is continuous. Moreover, $$ \lim_{n\to\infty}\log \varphi(t/n)^n = t\cdot \lim_{n\to\infty}\frac{\log\varphi(t/n)}{t/n}. $$
The last limit is a difference quotient: since $\varphi(0)=1$, we have $\log\varphi(0)=0$, so the limit exists and equals the derivative of $\log\varphi$ at $t=0$, namely $\varphi'(0)/\varphi(0)=\varphi'(0)$. Finally, because $\varphi(-t)=\overline{\varphi(t)}$, differentiability at $0$ forces $\varphi'(0)$ to be purely imaginary, i.e. $\varphi'(0)=iz$ for some $z\in\mathbb{R}$. Hence $$\varphi(t/n)^n\to e^{izt},$$ as required.
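Not part of the proof, but as a numerical sanity check: for i.i.d. $\mathrm{Exp}(1)$ variables (characteristic function $1/(1-it)$, differentiable at $0$), $A_n/n$ should concentrate around $z=E[Y_1]=1$. A minimal simulation with NumPy:

```python
import numpy as np

# Sanity check of the weak law: for Exp(1) variables, whose characteristic
# function 1/(1 - it) is differentiable at 0, A_n/n concentrates around z = 1.
rng = np.random.default_rng(0)
eps = 0.05
z = 1.0  # mean of Exp(1)

probs = []
for n in [100, 1000, 10000]:
    # 1000 independent copies of A_n/n, each the mean of n Exp(1) draws
    samples = rng.exponential(scale=1.0, size=(1000, n)).mean(axis=1)
    probs.append(np.mean(np.abs(samples - z) > eps))
    print(f"n={n:5d}  P(|A_n/n - z| > {eps}) ~ {probs[-1]:.3f}")
```

The estimated probability $P(|A_n/n - z|>\epsilon)$ shrinks as $n$ grows, matching the claimed convergence in probability.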