Convergence of sequence implies convergence of supremum of sequence


This is part of the proof of Donsker's theorem from Mörters' Brownian Motion. There the authors state that for any sequence $(a_n)_{n\geq 1}$ with $$\lim_{n\rightarrow\infty}\frac{a_n}{n}=1$$ it follows that

$$\lim_{n\rightarrow\infty}\sup_{0\leq k\leq n}\left|\frac{a_k-k}{n}\right|=0.$$

Intuitively, this seems clear: if $a_n/n$ converges to $1$, then the absolute value of the largest deviation of $a_k$ from the diagonal $k$, rescaled by $n$, should go to zero. But I have difficulties seeing how one would prove that formally. Can somebody give me a hint?
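
One idea I had is to split the supremum between small indices, where $|a_k-k|$ stays bounded by a constant, and large indices, where $a_k/k$ is already close to $1$; I am not sure this is the intended argument, and for the $k=0$ term I am assuming a convention such as $a_0=0$. Roughly:

Given $\varepsilon>0$, choose $N$ such that $|a_k-k|\leq\varepsilon k$ for all $k\geq N$, and set $C=\max_{0\leq k<N}|a_k-k|$. Then for $n\geq N$,

$$\sup_{0\leq k\leq n}\left|\frac{a_k-k}{n}\right|\leq\max\left(\frac{C}{n},\,\sup_{N\leq k\leq n}\frac{\varepsilon k}{n}\right)\leq\frac{C}{n}+\varepsilon,$$

which would give $\limsup_{n\rightarrow\infty}\sup_{0\leq k\leq n}\left|\frac{a_k-k}{n}\right|\leq\varepsilon$ for every $\varepsilon>0$. Is this the right way to make the argument precise?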