Proving uniform convergence on an unbounded interval


Prove that the following series $$\sum_{n=1}^{\infty}\frac{nx}{1+n^2\log^2(n)x^2}$$ converges uniformly on $[\epsilon,\infty)$ for any $\epsilon>0.$

What I have done:

For $n\geq 2$, the function $$f_n(x)=\frac{nx}{1+n^2\log^2(n)x^2}$$ is decreasing on $[1,\infty)$. So on $[1,\infty)$ its maximum value is $$f_n(1)=\frac{n}{1+n^2\log^2(n)}\leq\frac{n}{n^2\log^2(n)}=\frac{1}{n\log^2(n)}=M_n.$$ (The $n=1$ term is just $f_1(x)=x$, since $\log(1)=0$; a single fixed term does not affect the uniform convergence of the series, so we may work with $n\geq 2$.)
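In case the monotonicity claim needs justification, a quick derivative computation (for $n\geq 2$) gives $$f_n'(x)=\frac{n\bigl(1-n^2\log^2(n)x^2\bigr)}{\bigl(1+n^2\log^2(n)x^2\bigr)^2}\leq 0\quad\text{for }x\geq\frac{1}{n\log(n)},$$ and $\frac{1}{n\log(n)}<1$ for every $n\geq 2$, so $f_n$ is indeed decreasing on $[1,\infty)$.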

By the Cauchy condensation test, $\sum M_n$ is convergent. Therefore, by the Weierstrass M-test, $\sum f_n(x)$ converges uniformly on $[1,\infty)$.
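Spelled out, the condensation step (for the tail $n\geq 2$) is $$\sum_{k\geq 1}2^k M_{2^k}=\sum_{k\geq 1}\frac{2^k}{2^k\,k^2\log^2(2)}=\frac{1}{\log^2(2)}\sum_{k\geq 1}\frac{1}{k^2}<\infty,$$ so $\sum_{n\geq 2}\frac{1}{n\log^2(n)}$ converges.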

I am not entirely sure about the following statement. Please correct me if I am wrong:

If we can prove that $\sum f_n(x)$ is uniformly convergent on both $[\epsilon,1]$ and $[1,\infty)$, then it is uniformly convergent on $[\epsilon, \infty)$.

But I don't know how to prove that $\sum f_n(x)$ is uniformly convergent on $[\epsilon,1]$. I need some help.


3 Answers

Best Answer

We have that $$\left(\frac{t}{1+\log^2(n)t^2}\right)'=\frac{1-\log^2(n)t^2}{(1+\log^2(n)t^2)^2},$$ and therefore the positive function $t\mapsto \frac{t}{1+\log^2(n)t^2}$ is increasing on $[0,\frac{1}{\log(n)}]$ and decreasing on $[\frac{1}{\log(n)}, +\infty)$. Hence, substituting $t=nx$ and noting that $\frac{1}{\log(n)}<n\epsilon$ for all sufficiently large $n$, we get $$\sup_{x\in [\epsilon,+\infty)}\frac{nx}{1+n^2\log^2(n)x^2}= \sup_{t\in [n\epsilon,+\infty)}\frac{t}{1+\log^2(n)t^2}=\frac{n\epsilon}{1+\log^2(n)n^2\epsilon^2}\leq\frac{1}{\epsilon n\log^2(n)},$$ where the series $\sum \frac{1}{n\log^2(n)}$ is convergent (compare with the integral of $\frac{1}{x\log^2(x)}$). Since discarding finitely many terms does not affect uniform convergence, the Weierstrass M-test now applies.
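For the comparison mentioned at the end, the integral can be evaluated explicitly: $$\int_2^{\infty}\frac{dx}{x\log^2(x)}=\left[-\frac{1}{\log(x)}\right]_2^{\infty}=\frac{1}{\log(2)}<\infty,$$ and since $\frac{1}{x\log^2(x)}$ is positive and decreasing on $[2,\infty)$, the integral test gives the convergence of $\sum_{n\geq 2}\frac{1}{n\log^2(n)}$.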

Answer

The function $f_n$ attains its maximum over $(0,\infty)$ at $x=\frac1{n\log(n)}$, where its value is $\frac1{2\log(n)}$. If $n$ is large enough that $\frac{1}{n\log(n)}\leqslant\varepsilon$, then the maximum of $f_n$ on $[\varepsilon,\infty)$ is $f_n(\varepsilon)$. But$$f_n(\varepsilon)=\frac{n\varepsilon}{1+n^2\log^2(n)\varepsilon^2}<\frac1{n\log^2(n)\varepsilon}.$$As you wrote, we can use the Cauchy condensation test here.
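If it helps, the two computations used above, spelled out for $n\geqslant 2$: $$f_n\!\left(\frac{1}{n\log(n)}\right)=\frac{n\cdot\frac{1}{n\log(n)}}{1+n^2\log^2(n)\cdot\frac{1}{n^2\log^2(n)}}=\frac{1}{2\log(n)},\qquad \frac{n\varepsilon}{1+n^2\log^2(n)\varepsilon^2}<\frac{n\varepsilon}{n^2\log^2(n)\varepsilon^2}=\frac{1}{n\log^2(n)\varepsilon}.$$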

Answer

The statement you mention is correct: the uniform convergence of a sequence $\left(f_n\right)_{n\geqslant 1}$ to $f$ on a set $A$ means that $\sup_{x\in A}\left\lvert f_n(x)-f(x)\right\rvert$ goes to zero as $n$ goes to infinity. If the sequence is also uniformly convergent to $f$ on $B$, then use the fact that $\sup_{A\cup B}=\max\left\{\sup_A,\sup_B\right\}$ to get the uniform convergence on $A\cup B$. Since the uniform convergence of a series is the same as the uniform convergence of the partial sums, the argument also applies to the convergence of series.
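In symbols, writing $S_N$ for the $N$-th partial sum of the series and $S$ for its sum (notation not used above), the argument reads $$\sup_{x\in A\cup B}\bigl|S_N(x)-S(x)\bigr|=\max\Bigl\{\sup_{x\in A}\bigl|S_N(x)-S(x)\bigr|,\ \sup_{x\in B}\bigl|S_N(x)-S(x)\bigr|\Bigr\}\xrightarrow[N\to\infty]{}0.$$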

Now back to the initial problem. If $\varepsilon\leqslant x\leqslant 1$, then $$\left\lvert \frac{nx}{1+n^2\log^2(n)x^2}\right\rvert \leqslant\frac{n}{1+n^2\log^2(n)\varepsilon^2}, $$
and the right hand side is the general term of a convergent series.
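To see that the series on the right converges, one can compare it (for $n\geqslant 2$; the $n=1$ term is a single finite number) with the series that already appeared above: $$\frac{n}{1+n^2\log^2(n)\varepsilon^2}\leqslant\frac{n}{n^2\log^2(n)\varepsilon^2}=\frac{1}{\varepsilon^2}\cdot\frac{1}{n\log^2(n)},$$ and $\sum_{n\geqslant 2}\frac{1}{n\log^2(n)}$ converges by condensation or by the integral test.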