Study the convergence of a sequence of functions


Study the pointwise and uniform convergence of $f_n(x)$ on $A$

$$f_n(x)=\frac{x}{1+n^2 x^2} \ \ \ A=[-1,1]$$

My reasoning:

Pointwise convergence

For every $x \in A$, $$ \lim_{n \to +\infty} f_n(x)=f(x), \qquad f\equiv0.$$
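
Indeed, for fixed $x \neq 0$ the limit is explicit, $$\lim_{n \to +\infty}\frac{x}{1+n^2x^2}=0,$$ since the denominator tends to $+\infty$, while $f_n(0)=0$ for every $n$.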

Uniform convergence

We need the property $f_{n+1}(x) \le f_n(x)$ (a hypothesis of Dini's theorem); it holds on $[0,1]$, as checked below.
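
Indeed, for $x \in [0,1]$ the denominator increases with $n$, so $$0 \le f_{n+1}(x)=\frac{x}{1+(n+1)^2x^2} \le \frac{x}{1+n^2x^2}=f_n(x).$$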

So, the sequence converges uniformly on $[0,1]$.

Is this reasoning correct?

2 Answers

BEST ANSWER

For an alternate proof, note that $f_n'(x)=0$ implies $x=\pm\frac1n$, and \begin{align} f_n\left(\frac1n\right) &= \frac1{2n}\\ f_n\left(-\frac1n\right) &= -\frac1{2n}\\ f_n(-1) &= -\frac1{1+n^2}\\ f_n(1) &= \frac1{1+n^2}. \end{align} Comparing the critical values with the endpoint values, it follows that $$\lim_{n\to\infty}\sup_{x\in[-1,1]}|f_n(x)|=0, $$ so $f_n$ converges uniformly to $0$ on $[-1,1]$.
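
To spell out the last step: since $1+n^2 \ge 2n$, the largest of the four values above in absolute value is $\frac1{2n}$, hence $$\sup_{x\in[-1,1]}|f_n(x)|=\max\left\{\frac1{2n},\frac1{1+n^2}\right\}=\frac1{2n}\xrightarrow[n\to\infty]{}0.$$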


Yes, your reasoning is correct: a monotone sequence of continuous functions converging pointwise to a continuous function on a compact set converges uniformly (Dini's theorem). In case you are interested in a proof, you can find one here. The same reasoning can be applied to prove uniform convergence on $[-1,0]$, where the sequence of functions is monotonically increasing to $0$.

Just a side note: it is not true that you "need" that property, as you say. Indeed, you can prove uniform convergence directly from the definition. This can be done by considering the derivative of $f_n$ and looking for the point where it attains the maximum distance from the limiting function. But derivatives never bothered you anyway... :)
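
For what it's worth, a derivative-free route (not the one sketched above) uses the AM-GM inequality $1+n^2x^2 \ge 2n|x|$: $$\sup_{x\in[-1,1]}|f_n(x)|=\sup_{x\in[-1,1]}\frac{|x|}{1+n^2x^2}\le\frac1{2n}\xrightarrow[n\to\infty]{}0,$$ which is exactly uniform convergence to $0$.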