Showing Uniform Convergence fails on a set


I would like to show that the sequence of functions $\langle f_n \rangle$ defined by

$$f_n(x)=\frac{nx}{nx+1}$$

fails to converge uniformly on the set $[0, \infty)$. (Pointwise, $f_n \to f$ where $f(0) = 0$ and $f(x) = 1$ for $x > 0$.) I understand that the tricky point is $x = 0$: convergence near it is "slow," so to speak, but I'm not sure how to show that uniform convergence fails. I know that to show $\langle f_n \rangle$ fails to converge uniformly to some function $f$, I must show

$$\exists \varepsilon > 0 \forall N \in \mathbb{N} \exists n \geq N \exists x \in E: |f_n(x)-f(x)| \geq \varepsilon $$
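To build intuition for which $\varepsilon$ might work, here is a quick numerical sketch (using NumPy; the sample grid and the choice of $n$ values are mine, not part of the problem). Writing $f$ for the pointwise limit ($f(0) = 0$, $f(x) = 1$ for $x > 0$), the gap $\sup_x |f_n(x) - f(x)|$ stays near $1$ no matter how large $n$ is, because points just above $0$ always lag behind:

```python
import numpy as np

def f_n(n, x):
    """The sequence f_n(x) = nx / (nx + 1)."""
    return n * x / (n * x + 1)

def f(x):
    """Pointwise limit: f(0) = 0 and f(x) = 1 for x > 0."""
    return np.where(x > 0, 1.0, 0.0)

# Sample points crowding toward 0, where convergence is slow.
x = np.array([10.0 ** (-k) for k in range(12)])

for n in [10, 100, 1000, 10000]:
    gap = np.max(np.abs(f_n(n, x) - f(x)))
    print(f"n = {n:6d}  largest sampled gap = {gap:.6f}")
```

For each $n$, the sampled gap is nearly $1$: taking $x$ small enough (e.g. $x = 1/n$) keeps $f_n(x)$ far from $1$, which is exactly the obstruction to uniform convergence.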

The difficult part is choosing the right $\varepsilon$. I am convinced that I should do something involving the Squeeze Theorem and try to arrive at a contradiction, so I would say something like:

Since $x \in [0, \infty)$, for any $M \geq x$ we have

$$0 \leq x \leq M$$

$$f_n(0) \leq f_n(x) \leq f_n(M)$$

$$\frac{n \cdot 0}{n \cdot 0 +1} \leq \frac{nx}{nx+1} \leq \frac{nM}{nM+1}$$
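The middle line of this chain uses that each $f_n$ is increasing on $[0, \infty)$, which follows from a one-line derivative computation:

$$f_n'(x) = \frac{n(nx+1) - nx \cdot n}{(nx+1)^2} = \frac{n}{(nx+1)^2} > 0,$$

so for fixed $n$, $0 \leq x \leq M$ does give $f_n(0) \leq f_n(x) \leq f_n(M)$.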

How should I proceed? I seek to find a contradiction using some $\varepsilon > 0$. Any advice?