Prove that if $\lim_{x\to 0} \frac{f(x)}{x} = 1$, then the sequence $a_n = \left(\sum_{k=1}^n f\!\left(\frac{1}{k}\right)\right) - \ln n$ converges


Let $f: (-c, c) \to \mathbb{R}$, where $c$ is some positive real number, and $\lim_{x\to 0} \frac{f(x)}{x} = 1$. I need to prove that the sequence $(a_n)_{n\in\mathbb{N}},\ a_n = \left(\sum_{k=1}^n f\!\left(\frac{1}{k}\right)\right) - \ln n$ converges.

Now, this exercise is from a high-school-level textbook, and we don't do asymptotic analysis in high school where I live, but I feel it might be quite easy using little-$o$ notation. The problem is that I know very little asymptotic analysis (I know what each Landau notation means and some basic properties), and although I tried a proof, I don't know how to continue; your help would be appreciated.

Firstly, I note that: $$\lim_{x\to 0} \frac{f(x)}{x} = 1 \iff f(x) = x + o(x),\ x\to 0$$ And then I reverse the order of summation so that I can see that: $$\begin{align*}&\sum_{k=1}^n f\!\left(\frac{1}{k}\right) = \sum_{k=1}^n f\!\left(\frac{1}{n-k+1}\right) \\ &\frac{1}{n-k+1} \to 0,\ n \to \infty \\ &f\!\left(\frac{1}{n-k+1}\right) = \frac{1}{n-k+1} + o\!\left(\frac{1}{n-k+1}\right),\ n\to\infty \end{align*}$$ So the sequence $a_n$ is $$\begin{align*} a_n &= \left(\sum_{k=1}^{n} f\!\left(\frac{1}{n-k+1}\right)\right) - \ln n \\ &= \left(\sum_{k=1}^n \frac{1}{n-k+1} + o\!\left(\frac{1}{n-k+1}\right)\right)-\ln n \\ &= \left(\sum_{k=1}^n \frac{1}{n-k+1}\right) + o\!\left(\sum_{k=1}^n \frac{1}{n-k+1}\right) - \ln n \\ &= H_n - \ln n + o(H_n) \end{align*} $$

And now I don't know how to continue. I know that $H_n - \ln n$ converges and its limit is $\gamma$, the Euler–Mascheroni constant, and I also believe that $o(H_n) \subseteq o(n)$.
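As a side check of the one fact above that is certainly true, here is a small numerical illustration (names are my own, not from the question) that $H_n - \ln n$ approaches $\gamma \approx 0.5772156649$:

```python
import math

def harmonic_minus_log(n):
    """Partial harmonic sum H_n minus ln(n)."""
    h = sum(1.0 / k for k in range(1, n + 1))
    return h - math.log(n)

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

# The difference H_n - ln(n) decreases toward gamma (roughly gamma + 1/(2n)).
for n in (10, 1000, 100000):
    print(n, harmonic_minus_log(n), harmonic_minus_log(n) - GAMMA)
```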

Answer:

That step is actually not valid, due to a lack of uniformity: the estimate $f(x) = x + o(x)$ holds as $x \to 0$, but not uniformly enough to be summed over $k = 1, \dots, n$. For instance, by considering $$ f(x) = x+\frac{x}{\log\left(10+\frac{1}{x^2}\right)}$$ over $(-1,1)$ (with $f(0)$ defined as zero) we have $$ f\left(\tfrac{1}{n}\right) = \frac{1}{n}+\frac{1}{n\log(10+n^2)}$$ $$ a_n = (H_n-\log n)+\sum_{k=1}^{n}\frac{1}{k\log(10+k^2)} $$ where $H_n-\log n$ does converge to $\gamma$, but $\sum_{k\geq 1}\frac{1}{k\log(10+k^2)}$ is divergent, so the claimed statement is false as stated.
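A quick numerical sanity check of this counterexample (my own sketch, not part of the original answer): computing $a_n = \sum_{k=1}^n f(1/k) - \ln n$ for this $f$ shows the sequence keeps growing, roughly like $\log\log n$, instead of settling down:

```python
import math

def a(n):
    """a_n = sum_{k=1}^n f(1/k) - ln(n) for f(x) = x + x/log(10 + 1/x^2)."""
    s = 0.0
    for k in range(1, n + 1):
        x = 1.0 / k
        # f(1/k) = 1/k + 1/(k * log(10 + k^2))
        s += x + x / math.log(10.0 + 1.0 / x**2)
    return s - math.log(n)

# The values keep increasing (very slowly), consistent with divergence.
for n in (100, 10000, 200000):
    print(n, a(n))
```

The growth is slow because $\sum 1/(k\log(10+k^2))$ diverges only like an iterated logarithm, but the increase is still clearly visible numerically.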
Probably some hypothesis on $f$ is missing, such as differentiability or Hölder continuity at the origin.
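For instance (a sketch under a stronger assumption, added here for illustration): if $f(x) = x + O\!\left(x^{1+\varepsilon}\right)$ as $x \to 0$ for some $\varepsilon > 0$, then the argument can be repaired, since
$$ f\!\left(\tfrac{1}{k}\right) - \frac{1}{k} = O\!\left(\frac{1}{k^{1+\varepsilon}}\right), \qquad \sum_{k\geq 1} \left| f\!\left(\tfrac{1}{k}\right) - \frac{1}{k} \right| < \infty, $$
and therefore
$$ a_n = (H_n - \ln n) + \sum_{k=1}^{n}\left(f\!\left(\tfrac{1}{k}\right) - \frac{1}{k}\right) \longrightarrow \gamma + \sum_{k\geq 1}\left(f\!\left(\tfrac{1}{k}\right) - \frac{1}{k}\right). $$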