Prove or disprove: If $f(x)$ is continuous in $(0,1]$ and $f(x)\to\infty$ as $x\to 0^+$, then $\lim_{n\to\infty}\sum_{k=1}^n f(k/n)$ does not exist.


I'm trying to prove or disprove the following conjecture:

If $f(x)$ is continuous in $(0,1]$ and $f(x)\to\infty$ as $x\to 0^+$ then $L=\lim\limits_{n\to\infty}\sum\limits_{k=1}^n f\left(\frac{k}{n}\right)$ does not exist.

My attempt

I tried proof by contradiction. Assume $L$ exists.

$L=\lim\limits_{n\to\infty}n\left(\frac{1}{n}\right)\sum\limits_{k=1}^n f\left(\frac{k}{n}\right)=\left(\lim\limits_{n\to\infty}n\right)\int_{0}^1 f(x)dx$

(EDIT: As mentioned by @FShrike in the comments, the previous step is not valid.)

$\therefore \int_{0}^1 f(x)dx=0$

There are functions $f(x)$, continuous in $(0,1]$, such that $f(x)\to\infty$ as $x\to 0^+$ and $\int_{0}^1 f(x)dx=0$ . For example, $f(x)=-\ln{x}-1$, in which case $L$ does not exist, by Stirling's approximation.
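As a quick numerical sanity check (a sketch, not part of the proof), one can tabulate the sums for $f(x)=-\ln{x}-1$ and watch them drift to $-\infty$ like $-\frac12\ln(2\pi n)$, in line with Stirling's approximation:

```python
import math

# f(x) = -ln(x) - 1 is continuous on (0,1], tends to +infinity as x -> 0+,
# and its improper integral over (0,1] equals 0.
f = lambda x: -math.log(x) - 1

def S(n):
    """The sum sum_{k=1}^n f(k/n) (without the 1/n factor)."""
    return sum(f(k / n) for k in range(1, n + 1))

for n in (10**2, 10**4, 10**5):
    print(n, S(n))
# Stirling gives S(n) = -(1/2) ln(2*pi*n) + O(1/n), so S(n) -> -infinity.
```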

But I do not see why all such functions $f(x)$ would imply that $L$ does not exist.

Context:

I am interested in geometrical infinite products (example1, example2). The conjecture in this question, via the substitution $f(x)=-\ln{g(x)}$, is equivalent to: If $g(x)$ is continuous in $(0,1]$ and $\lim\limits_{x\to 0^+}g(x)=0$ then $\lim\limits_{n\to\infty}\prod\limits_{k=1}^ng\left(\frac{k}{n}\right)$ either equals $0$ or does not exist, which stands in interesting contrast with the fact that infinite products of lengths or areas, that tend to $0$, can equal a positive number.
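To illustrate the product formulation numerically (a sketch; the choice $g(x)=ex$, corresponding to $f(x)=-\ln{x}-1$ via $f=-\ln g$, is mine): the partial products neither vanish nor settle, growing like $\sqrt{2\pi n}$ by Stirling, so here the product limit fails to exist by diverging to $+\infty$.

```python
import math

# g(x) = e*x is continuous on (0,1] with g(x) -> 0 as x -> 0+.
def P(n):
    prod = 1.0
    for k in range(1, n + 1):
        prod *= math.e * k / n
    return prod  # equals e^n * n! / n^n ~ sqrt(2*pi*n) by Stirling

for n in (10, 100, 1000):
    print(n, P(n), math.sqrt(2 * math.pi * n))
```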

EDIT2:

I'm not sure if this is helpful, but I have noticed that $L_2=\lim\limits_{n\to\infty}\sum\limits_{k=1}^n f\left(\frac{k-1/2}{n}\right)$ can exist.

For example, $\lim\limits_{n\to\infty}\sum\limits_{k=1}^n \left(-\ln{\left(\frac{k-1/2}{n}\right)}-1\right)=-\frac{\ln{2}}{2}$. (Another question of mine yielded methods for dealing with the sum $\sum\limits_{k=1}^n \ln{(k-\frac12)}$.)
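The claimed midpoint limit is easy to check numerically (a sketch, using the same $f(x)=-\ln{x}-1$ as above):

```python
import math

def midpoint_sum(n):
    """sum_{k=1}^n f((k - 1/2)/n) for f(x) = -ln(x) - 1."""
    return sum(-math.log((k - 0.5) / n) - 1 for k in range(1, n + 1))

target = -math.log(2) / 2  # the claimed limit, about -0.34657
for n in (10, 100, 10000):
    print(n, midpoint_sum(n), target)
```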

I do not understand why replacing $k$ with $k-\frac12$ seems to make the limit existable (if that's a word).

There are 7 answers below.

Answer (0 votes):

This is more of a comment, but it highlights the fallacy of claiming that the OP's hypotheses imply $L=\infty$, which is simply not true. Consider $f(x)=x^{-1/2}$ for $0 < x \le 1/16$, then $f(x)$ linear, going from $4$ down to $-100$ on $[1/16, 1/8]$ (so $f(x)=-1664x+108$), and $f(x)=-100$ for $1/8 \le x \le 1$.

Let $n=16m$. Then $\sum_{k=1}^{16m}f(k/n)=\sum_{k=1}^{m}\sqrt {16m/k}+\sum_{k=m+1}^{2m}(-104k/m+108)+\sum_{k=2m+1}^{16m}(-100)$

But $\sum_{k=1}^{m}1/\sqrt k \le 1+\int_1^{m}dx/\sqrt x=2\sqrt m -1$ so $\sum_{k=1}^{m}\sqrt {16m/k} \le 8m$

$\sum_{k=m+1}^{2m}(-104k/m+108)=-104(2m+1)+104(m+1)/2+108m=-48m-52$

$\sum_{k=2m+1}^{16m}(-100)=-1400m$, so the original sum is at most $8m-48m-52-1400m=-1440m-52$, which is highly negative and indeed goes to $-\infty$.

More generally, it is not hard to show that if $f$ is monotonic on $[0,1]$ (the above example is) and $\int_0^1f(x)dx$ is finite (the integral being the Lebesgue one, or in our case $\lim_{\epsilon \to 0}\int_{\epsilon}^1f(x)dx$, with the latter integral Riemann), then $$\frac{\sum_{k=1}^nf(k/n)}{n} \to \int_0^1f(x)dx,$$ which immediately shows that if the integral is negative and the function decreasing, the sums in the OP go to $-\infty$.
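A numerical sketch of the piecewise example above (the transcription into code is mine): the sums plunge to $-\infty$, while the averages $\frac1n\sum_{k=1}^n f(k/n)$ settle near $\int_0^1 f = \frac12 - 3 - \frac{175}{2} = -90$.

```python
def f(x):
    # the answer's piecewise function
    if x <= 1 / 16:
        return x ** -0.5          # sqrt singularity at 0+
    if x <= 1 / 8:
        return -1664 * x + 108    # linear bridge from 4 down to -100
    return -100.0                 # constant tail on [1/8, 1]

def riemann_sum(n):
    return sum(f(k / n) for k in range(1, n + 1))

for m in (10, 100, 1000):
    n = 16 * m
    s = riemann_sum(n)
    print(n, s, s / n)  # the sum diverges to -infinity; the average tends to -90
```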

I see nothing to suggest that we may not get examples where things balance nicely and the limit is finite or at least the sums are bounded.

Answer (7 votes):

This is more of a long comment than a solution. Since $f$ is continuous and $\lim_{x\rightarrow0+}f(x)=\infty$, either $I=\int^1_0f=\infty$ or $I=\int^1_0f$ is finite. I propose to split the interval $(0,1]$ into two pieces so that on $(0,a]$, $f>1$, and $\int^1_af$ has the same sign as $I$ (or $\int^1_af<0$ if $I=0$).

The sum $\sum^n_{k=\lfloor na\rfloor +1}f(k/n)\sim n\int^1_af$ in the sense that $b_n:=\frac{\sum^n_{k=\lfloor na\rfloor +1}f(k/n)}{n\int^1_af}\xrightarrow{n\rightarrow\infty}1$. We then consider $$R_n:=\frac{\sum^n_{k=1}f(k/n)}{n\int^1_af}=\frac{\tfrac1n\sum^{\lfloor na\rfloor}_{k=1}f(k/n)}{\int^1_af}+b_n$$

All members in the sum $\sum^{\lfloor na\rfloor}_{k=1}f(k/n)$ are positive.

If $f$ is monotone nonincreasing in $(0,a]$ and integrable, then $$R_n\xrightarrow{n\rightarrow\infty}\frac{\int^a_0f}{\int^1_af}+1=\frac{\int^1_0f}{\int^1_af}=R$$ If $I\neq0$, then $$S_n:=\sum^n_{k=1}f(k/n)\xrightarrow{n\rightarrow\infty}\operatorname{sign}(I)\cdot\infty$$

If $f$ is monotone nonincreasing in $(0,a]$ and $I=\infty$, then $$R_n\xrightarrow{n\rightarrow\infty}\operatorname{sign}\Big(\int^1_af\Big)\cdot\infty$$ and so $S_n\xrightarrow{n\rightarrow\infty}\infty$.

This suggests that the case $\int^1_0f=0$ is where the interesting stuff happens.

I also think that understanding the case where $f$ is monotone nonincreasing will yield more information even in the general case, for we can consider the monotone envelopes \begin{align} \alpha(x)=\inf\{f(t): 0<t\leq x\}\qquad \beta(x)=\sup\{f(t): x\leq t\leq 1\} \end{align} $\alpha$ and $\beta$ are both monotone nonincreasing and $$\alpha\leq f\leq \beta$$
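The envelope construction is straightforward to compute on a grid (a sketch; the non-monotone test function is a hypothetical choice of mine):

```python
import math
from itertools import accumulate

# a non-monotone f, continuous on (0,1], with f -> +infinity at 0+
f = lambda x: -math.log(x) - 1 + 0.3 * math.sin(20 * x)

xs = [k / 1000 for k in range(1, 1001)]
fv = [f(x) for x in xs]

# alpha(x) = inf of f on (0, x]  -> running min from the left
alpha = list(accumulate(fv, min))
# beta(x) = sup of f on [x, 1]   -> running max from the right
beta = list(accumulate(reversed(fv), max))
beta.reverse()

# both envelopes are nonincreasing and sandwich f on the grid
assert all(a2 <= a1 for a1, a2 in zip(alpha, alpha[1:]))
assert all(b2 <= b1 for b1, b2 in zip(beta, beta[1:]))
assert all(a <= v <= b for a, v, b in zip(alpha, fv, beta))
print("envelopes verified on the grid")
```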

Answer (4 votes):

First consider $g(x)$ such that $g(x)=1$ on $(2^{-3},2^{-1}]$, $g(x)=2$ on $(2^{-5}, 2^{-3}]$, $g(x) = 4$ on $(2^{-7}, 2^{-5}]$, etc. Taking them from rightmost on the number line first and going to the left, these intervals fill up the space from $0.5$ to $0$, and each one is one-fourth the size of the previous one. So in the limit, each one will have one-fourth as many sample points as the one before, and twice the value, hence half the total value of the one before. The contributions therefore form a convergent geometric series dominated by the rightmost interval, whose total is proportional to $n$; so the total value over all of them will tend towards a linear function of $n$. If we give the interval from $0.5$ to $1$ a constant value, then the total of values in that interval will also tend towards a linear function of $n$. So we just choose the constant so that those two linear functions cancel out to some finite value.
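A numerical sketch of this construction (the interval/value bookkeeping in code is my transcription): the scaled sums $\frac1n\sum_{k/n\le 1/2} g(k/n)$ approach $\int_0^{1/2} g = \sum_{j\ge1}\frac34\cdot 2^{-j} = \frac34$, confirming that the total over $(0,\tfrac12]$ grows linearly in $n$.

```python
import math

def g(x):
    """g = 2^(j-1) on (2^-(2j+1), 2^-(2j-1)], j = 1, 2, ...; valid for 0 < x <= 1/2."""
    j = math.floor((-math.log2(x) + 1) / 2)
    return 2.0 ** (j - 1)

def scaled_sum(n):
    # (1/n) * sum of g(k/n) over the sample points in (0, 1/2]
    return sum(g(k / n) for k in range(1, n // 2 + 1)) / n

for n in (2**8, 2**12, 2**16):
    print(n, scaled_sum(n))  # approaches 3/4, so the raw sum is ~ (3/4) n
```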

Now, this does have the issue that $g(x)$ is not continuous, but I think that can be dealt with: we can define a continuous $f$ that is “close” to $g$, passing from the value of one interval to the next over exponentially shrinking transition regions.

Answer (0 votes):

I would like to further elaborate on TravorLZH's answer.

Under the assumptions in that answer (i.e., $f$ decreasing, with $f'$ monotonic and continuously differentiable) we have that

$$\sum_{k=1}^nf\left(\frac kn\right)=n\int_{1/n}^1f(t)\mathrm dt+\frac12f\left(\frac1n\right)+\frac12f(1)+O\left(-{f'(1/n)\over n}\right)+O\left(\frac1n\right).$$

Thus if we further assume that $$\lim_{x\to 0^+} x f'(x) = L_1 < \infty,$$ the sum on the left-hand side converges iff $$\lim_{x\to 0^+}\frac1x \int_x^1 f(t) dt +\frac12 f(x) = L_2 <\infty.$$

If $F(x)$ is a primitive of $f(x)$, the condition reads $$g(x) = \frac1x \left[F(1)-F(x)\right] + \frac12 F'(x)$$ with $\lim_{x\to 0^+} g(x) = L_2$. This is a linear ODE for $F$, namely $F'(x)-\frac2xF(x)=2g(x)-\frac{2F(1)}{x}$, whose solutions are \begin{eqnarray} F(x) &=& x^2 \int_1^x \left(\frac{2g(t)}{t^2}-\frac{2F(1)}{t^3}\right)dt + C_1x^2=\\ &=&F(1) + 2x^2\int_1^x \frac{g(t)}{t^2}dt + C_2x^2. \end{eqnarray} Differentiating yields $$f(x) = 4x\int_1^x \frac{g(t)}{t^2} dt + 2g(x) +2C_2x.$$ Since $g$ is continuous on $(0,1]$ and $\lim_{x\to 0^+}g(x)=L_2$, the quantity $$G = \sup_{x\in(0,1]}|g(x)|$$ is finite. Also $$\lim_{x\to 0^+} f(x) = -\lim_{x\to 0^+}4x \int_x^1 \frac{g(t)}{t^2}dt+2L_2,$$ whenever the limit on the right exists.

Fix $\varepsilon > 0$ and select $\delta > 0$ in such a way that $|g(x)| < |L_2| + \varepsilon$ for $0< x<\delta$. Then for $0<x<\delta$, \begin{eqnarray} \left|x\int_x^1 \frac{g(t)}{t^2}dt\right| &\leq& x \int_x^\delta \frac{|L_2|+\varepsilon}{t^2} dt+ x\int_\delta^1 \frac{G}{t^2}dt\leq\\ &\leq& (|L_2|+\varepsilon) + \frac{G}{\delta}, \end{eqnarray} so $$|f(x)| \leq 4(|L_2|+\varepsilon) + \frac{4G}{\delta} + 2G + 2|C_2|x$$ is bounded as $x\to 0^+$. But this contradicts the hypothesis that $\lim_{x\to 0^+} f(x) = +\infty$.

In conclusion, if $f(x)$ is decreasing with $f'$ monotonic and continuously differentiable, the sum in the OP cannot converge if $$\lim_{x\to 0^+} xf'(x) =L_1 < \infty.$$

Answer (4 votes):

This answer discusses the situation where we restrict attention to $f$ decreasing, with $f'$ monotonic and continuously differentiable.

Applying the Euler-Maclaurin formula, we get

$$ \sum_{k=1}^nf\left(\frac kn\right)=n\int_{1/n}^1f(t)\mathrm dt+\frac12f\left(\frac1n\right)+\frac12f(1)+\frac1n\int_1^n\overline B_1(t)f'\left(\frac tn\right)\mathrm dt, $$

where $\overline B_1(t)=t-\lfloor t\rfloor-\frac12$ is the first Bernoulli function.

Integration by parts on the rightmost integral gives

$$ \int_1^n\overline B_1(t)f'\left(\frac tn\right)\mathrm dt=\left.\frac12\overline B_2(t)f'\left(\frac tn\right)\right|_1^n-{1\over2n}\int_1^n\overline B_2(t)f''\left(\frac tn\right)\mathrm dt=O\left\{-f'\left(\frac1n\right)\right\}+O(1) $$

Combining everything, we have

$$ \sum_{k=1}^nf\left(\frac kn\right)=n\int_{1/n}^1f(t)\mathrm dt+\frac12f\left(\frac1n\right)+\frac12f(1)+O\left(-{f'(1/n)\over n}\right)+O\left(\frac1n\right).\tag1 $$

Now, let $f(x)=-\ln x-1$, so that it satisfies the conditions for (1). Calculation gives

$$ \int_{1/n}^1f(t)\mathrm dt=-t\ln t\,\Big|_{1/n}^1=\frac1n\ln\frac1n,\quad -f'\left(\frac1n\right)=\frac1n, $$

so (1) becomes

$$ \sum_{k=1}^nf\left(\frac kn\right)=\ln\frac1n+\frac12\left(-\ln\frac1n-1\right)+O(1)+O\left(\frac1n\right)=\frac12\ln\frac1n+O(1), $$

which indicates that the limit does not necessarily exist even when $\int_0^1f(t)\mathrm dt=0$.
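The asymptotic can be checked numerically (a sketch): for $f(x)=-\ln x-1$, Stirling in fact gives $\sum_{k=1}^n f\left(\frac kn\right) = \frac12\ln\frac1n - \frac12\ln(2\pi) + O(1/n)$, pinning down the $O(1)$ term.

```python
import math

def S(n):
    # sum_{k=1}^n f(k/n) for f(x) = -ln(x) - 1
    return sum(-math.log(k / n) - 1 for k in range(1, n + 1))

for n in (10, 1000, 100000):
    # the O(1) part of S(n) - (1/2) ln(1/n); tends to -ln(2*pi)/2, about -0.9189
    print(n, S(n) + 0.5 * math.log(n))
```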

Answer (4 votes):

If we assume that $f(x)$ is decreasing, here is an argument that $L=\lim\limits_{n\to\infty}\sum\limits_{k=1}^n f\left(\frac{k}{n}\right)$ does not exist.

[Figure: the Riemann sums for $n$ and $2n$ compared; the red and blue regions mark the areas by which they differ.]

The graph shows an example with $n=3$. (The curve does not have to be above the x-axis.)

$\begin{align} \frac{1}{2n}\sum\limits_{k=1}^{2n}f\left(\frac{k}{2n}\right) & =\frac{1}{n}\sum\limits_{k=1}^{n}f\left(\frac{k}{n}\right)+\color{red}{(\text{area of red region})}+\color{blue}{(\text{total area of blue regions})}\\ & =\frac{1}{n}\sum\limits_{k=1}^{n}f\left(\frac{k}{n}\right)+\color{red}{\frac{1}{2n}\left(f\left(\frac{1}{2n}\right)-f\left(\frac{1}{n}\right)\right)}+\color{blue}{\frac{1}{2n}\left(\frac{1}{2}\left(f\left(\frac{1}{n}\right)-f(1)\right)\right)+O\left(\frac{1}{n}\right)} \end{align}$

Multiply by $2n$ and rearrange.

$2\sum\limits_{k=1}^{n}f\left(\frac{k}{n}\right) - \sum\limits_{k=1}^{2n}f\left(\frac{k}{2n}\right) =\color{brown}{\frac{1}{2}f\left(\frac{1}{n}\right) - f\left(\frac{1}{2n}\right)}+\frac{1}{2}f(1)+c+O\left(\frac{1}{n^2}\right)$

Since $f$ is decreasing, $f\left(\frac{1}{n}\right) \le f\left(\frac{1}{2n}\right)$, so $\color{brown}{\frac{1}{2}f\left(\frac{1}{n}\right) - f\left(\frac{1}{2n}\right)} \le -\frac{1}{2}f\left(\frac{1}{2n}\right)$

$\therefore 2\sum\limits_{k=1}^{n}f\left(\frac{k}{n}\right) - \sum\limits_{k=1}^{2n}f\left(\frac{k}{2n}\right) \le -\frac{1}{2}f\left(\frac{1}{2n}\right)+\frac{1}{2}f(1)+c+O\left(\frac{1}{n^2}\right)$

If $L=\lim\limits_{n\to\infty}\sum\limits_{k=1}^n f\left(\frac{k}{n}\right)$ exists, then as $n\to\infty$, the LHS $\to L$, but the RHS $\to-\infty$ because $f(x)\to\infty$ as $x\to0^+$ forces $f\left(\frac{1}{2n}\right)\to\infty$. Contradiction.

$\therefore L$ does not exist.
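For a concrete decreasing example the contradiction is visible numerically (a sketch with $f(x)=-\ln x-1$): the difference $2\sum_{k=1}^n f\left(\frac kn\right)-\sum_{k=1}^{2n}f\left(\frac{k}{2n}\right)$ drifts to $-\infty$.

```python
import math

def S(n):
    # sum_{k=1}^n f(k/n) for the decreasing example f(x) = -ln(x) - 1
    return sum(-math.log(k / n) - 1 for k in range(1, n + 1))

def doubling_gap(n):
    """2*S(n) - S(2n); by the argument above it must be unbounded below."""
    return 2 * S(n) - S(2 * n)

for n in (10, 100, 10000):
    print(n, doubling_gap(n))  # decreasing without bound, as predicted
```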

Answer (0 votes):

The statement in the title of this question can be disproved. I posted an equivalent question on Math Overflow, and it has been answered.

Note: the $f(x)$ in this question and the $f(x)$ in the Math Overflow question are different. Here is how they are related:

$$[f(x)\text{ in this question}]=-\ln{[f(x)\text{ in the Math Overflow question}]}$$