Inequality $f(n)\sum_{k=n}^{\infty}\frac{1}{f(k)}\le Cn^2$ for increasing function $f$


Assume $f$ is increasing and $\sum_n 1/f(n)<\infty$. I am trying to prove that $$f(n)\sum_{k=n}^{\infty}\frac{1}{f(k)}\le C n^2$$ for some positive constant $C$. (I believe that $C$ could be made arbitrarily small, and that $n^2$ on the right-hand side could be replaced by $n^{1+\epsilon}$.) This is equivalent to $$\limsup_{n\to\infty}n^{-2} f(n)\sum_{k=n}^{\infty}\frac{1}{f(k)}<\infty$$

One could bring this inequality into the integral form $$ \limsup_{x\to\infty}x^{-2} f(x)\int_{x}^{\infty}\frac{dy}{f(y)}<\infty$$ I tried L'Hôpital's rule, but it does not really work.
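A quick numerical sanity check supports the conjecture for standard functions (a rough sketch; the helper `tail_ratio`, the truncation rule, and the sample functions are my own choices, not part of the question):

```python
import math

def tail_ratio(f, n, max_terms=10**6, eps=1e-12):
    """Approximate n^{-2} f(n) * sum_{k>=n} 1/f(k), truncating the tail
    once a new term is negligible or max_terms is reached."""
    tail, k = 0.0, n
    while k < n + max_terms:
        term = 1.0 / f(k)
        tail += term
        if term < eps * tail:
            break
        k += 1
    return f(n) * tail / n**2

# For f(n) = 2^n the tail is exactly 2^{1-n}, so the quantity equals 2/n^2.
for name, f in [("2^n", lambda n: 2.0 ** n),
                ("n^3", lambda n: float(n) ** 3),
                ("n log^2 n", lambda n: n * math.log(n) ** 2)]:
    print(name, [tail_ratio(f, n) for n in (10, 100, 200)])
```

In each case the values shrink as $n$ grows, consistent with the limsup being $0$ rather than merely finite (the slowly converging $n\log^2 n$ tail is only roughly truncated here).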


3 Answers

Best Answer

Surprisingly, not only is this conjecture false, but it remains false if $Cn^2$ is replaced by any function $B(n)$ of $n$, no matter how fast it grows!

The idea is to use a sequence of values $(f(n))_{n=1}^\infty$ that is constant for very long stretches before switching to a new value. (Indeed, I imagine the OP tested the conjecture against standard smooth functions like powers of $n$, exponential functions, and things like $f(n) = n(\log n)^2$; in general, constant-for-long-stretches functions are a really good complement to "normal" functions for testing convergence-related conjectures.)

For simplicity let's assume that $B(n)$ is positive-integer-valued. Define a sequence of cutoff points $x_j$ and a sequence of values $v_j$ recursively as follows:

  • $x_1=1$;
  • Given $x_j$, set $v_j=\dfrac1{2^jB(x_j)}$ and $x_{j+1} = x_j + B(x_j)$.
  • Then define $f(n) = \dfrac1{v_j}$ where $j$ is the unique positive integer such that $x_j \le n < x_{j+1}$.

In other words, the sequence $\biggl(\dfrac1{f(n)}\biggr)_{n=1}^\infty$ consists of $B(x_1)$ copies of $v_1$, then $B(x_2)$ copies of $v_2$, and so on. In particular, $$ \sum_{n=1}^\infty \frac1{f(n)} = \sum_{j=1}^\infty v_j B(x_j) = \sum_{j=1}^\infty \frac1{2^j} = 1. $$ On the other hand, $$ f(x_j) \sum_{k=x_j}^\infty \frac1{f(k)} > f(x_j) \sum_{k=x_j}^{x_{j+1}-1} \frac1{f(k)} = \sum_{k=x_j}^{x_{j+1}-1} 1 = x_{j+1}-x_j = B(x_j); $$ in particular, $\displaystyle f(n) \sum_{k=n}^\infty \frac1{f(k)} > B(n)$ for infinitely many $n$.
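The construction is easy to verify with exact rational arithmetic. The sketch below is my own code, not from the answer; I take $B(n)=n^3$ and build only the first eight blocks, since the cutoffs $x_j$ grow triple-exponentially:

```python
from fractions import Fraction

def build(B, num_blocks):
    # x_1 = 1; given x_j: v_j = 1/(2^j B(x_j)) and x_{j+1} = x_j + B(x_j)
    xs, vs = [1], []
    for j in range(1, num_blocks + 1):
        xj = xs[-1]
        vs.append(Fraction(1, 2 ** j * B(xj)))
        xs.append(xj + B(xj))
    return xs, vs

B = lambda n: n ** 3          # any growth rate B works the same way
xs, vs = build(B, 8)          # xs starts 1, 2, 10, 1010, ...

# Each block j contributes B(x_j) copies of v_j to sum_n 1/f(n), i.e. 2^{-j}:
total = sum(vs[j] * B(xs[j]) for j in range(8))
print(total)                  # 255/256; the full series would sum to 1

# f(x_j) times the tail starting at x_j strictly exceeds B(x_j):
for j in range(7):
    tail = sum(vs[i] * B(xs[i]) for i in range(j, 8))
    assert tail / vs[j] > B(xs[j])   # f(x_j) = 1/v_j
```

With $B(n)=n^3$ the quantity $x_j^{-2}f(x_j)\sum_{k\ge x_j}1/f(k)$ exceeds $x_j$, so no bound of the form $Cn^2$ can hold, and the same recursion defeats any prescribed $B(n)$.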

Answer

You are going to need an additional condition on $f,$ like maybe convexity.

Let $n_1=1, n_{i+1}=n_i^3+n_i.$

Define $$f(n)=2^{n_{i+1}}, \quad n_i\leq n<n_{i+1}$$

Then since $f(n)>2^n$ for all $n,$ the series $\sum_n \frac1{f(n)}$ converges.

Now when $n=n_i,$ $$\sum_{k=n_i}^{\infty}\frac1{f(k)}\geq \sum_{k=n_i}^{n_{i+1}-1}\frac1{f(k)}=\frac{n_i^3}{f(n_i)}$$

Thus, for $n=n_i$,

$$n^{-2}f(n)\sum_{k=n}^{\infty} \frac1{f(k)}\geq \frac{n_i^3}{n_i^2}=n,$$

so the conjectured bound fails for infinitely many $n$, and we have a counterexample.

If you want $f$ strictly increasing, you can tweak this counterexample.

Essentially, you just need $f$ nearly constant for long runs.
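This counterexample can also be checked exactly. The sketch below is my own code, keeping only the first three blocks, because $n_5 = 1010^3 + 1010$ makes $2^{n_5}$ intractable:

```python
from fractions import Fraction

# cutoffs n_1 = 1, n_{i+1} = n_i^3 + n_i; only the first few are usable
ns = [1]
while ns[-1] < 2000:
    ns.append(ns[-1] ** 3 + ns[-1])
ns = ns[:4]                      # [1, 2, 10, 1010]

def f(n):
    # f(n) = 2^{n_{i+1}} on the block n_i <= n < n_{i+1}
    for i in range(len(ns) - 1):
        if ns[i] <= n < ns[i + 1]:
            return 2 ** ns[i + 1]
    raise ValueError("n outside the blocks built")

for i in range(len(ns) - 1):
    ni = ns[i]
    # exact tail over the remaining blocks (block j has n_j^3 equal terms)
    tail = sum(Fraction(ns[j + 1] - ns[j], 2 ** ns[j + 1])
               for j in range(i, len(ns) - 1))
    ratio = Fraction(f(ni)) * tail / ni ** 2
    assert ratio >= ni           # n^{-2} f(n) * tail >= n at n = n_i
```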


I think convexity might be enough; that is, you'd require $$f(n-1)+f(n+1)\geq 2f(n)$$ for all $n.$

I’ll work on that for a bit.

Answer

Since the other answers have falsified the statement, this answer heuristically illustrates why the bound does hold for "common" functions $f$.

Suppose $f(n)$ can be extended to a smooth function $f(x)$, and use the asymptotic $\int_{x}^\infty\frac{dt}{f(t)} \sim \sum_{k=x}^\infty \frac{1}{f(k)}$. We may assume that $\frac{f(x)}{x^2}\rightarrow \infty$; then we compute the limit by L'Hôpital: $$\frac{\int_x^\infty \frac{dt}{f(t)}}{\frac{x^2}{f(x)}} \sim \frac{-\frac{1}{f(x)}} {\frac{2xf(x)-x^2f'(x)}{f(x)^2}} = \frac{1}{x^2\frac{f'(x)}{f(x)} -2x}\xrightarrow {?} 0$$ For "common" functions, $x^2\frac{f'(x)}{f(x)} -2x$ will almost certainly tend to infinity unless it is identically $0$; but the solutions of the ODE $x^2\frac{f'(x)}{f(x)} -2x = 0$ are exactly $f(x) = Cx^2$, which does not dominate $x^2$ and is therefore unavailable for building a counterexample.

This also makes it plausible that $C$ can be made arbitrarily small, since the limit is likely to be $0$. The same heuristic works if you replace $n^2$ by $n^{1+\epsilon}$: the same calculation yields $\frac{1}{x^\epsilon \left(x\frac{f'(x)}{f(x)} - (1+\epsilon)\right)}$, which shows why $\epsilon>0$ is necessary: we want the factor $x^\epsilon$ to be there to push the denominator to $\infty$ for most $f$.
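As a concrete check of this heuristic (my own choice of $f(x)=e^x$, where the tail integral is exactly $e^{-x}$), the true ratio $x^{-2}f(x)\int_x^\infty dt/f(t) = 1/x^2$ and the L'Hôpital prediction $1/(x^2 f'/f - 2x) = 1/(x^2-2x)$ agree to leading order:

```python
# For f(x) = e^x: int_x^inf e^{-t} dt = e^{-x}, so the true ratio is
# f(x) * e^{-x} / x^2 = 1/x^2, while the heuristic predicts
# 1 / (x^2 f'/f - 2x) = 1 / (x^2 - 2x), because f'/f = 1 here.
for x in (10.0, 100.0, 1000.0):
    true = 1.0 / x ** 2
    predicted = 1.0 / (x ** 2 - 2 * x)
    print(x, true, predicted, predicted / true)   # last column -> 1
```

Both quantities tend to $0$, and their ratio tends to $1$, as the heuristic suggests.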