Dini's theorem failing at a single point


Suppose we have a monotone decreasing sequence of continuous functions $f_k: [0, 1] \to [0, 1]$, i.e. $f_{k + 1}(\beta) \leq f_k(\beta)$ for all $\beta$, and a continuous function $f: [0, 1] \to [0, 1]$. If $f_k$ converges pointwise to $f$ at every $\beta \neq 0$, is it true that

  1. $f_k$ converges pointwise to $f$ for all $\beta$?
  2. $f_k$ converges uniformly to $f$?

If $f_k$ also converged pointwise to $f$ at $\beta = 0$, then uniform convergence would follow by a simple application of Dini's theorem, but does the conclusion still hold when a single point is missing?
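For concreteness, here is a quick numerical sketch of one standard counterexample (an illustration I am supplying, not necessarily the one from the comments): $f_k(x) = (1 - x)^k$ on $[0, 1]$. Each $f_k$ is continuous, the sequence is monotone decreasing, and $f_k(x) \to 0$ for every $x \neq 0$, yet $f_k(0) = 1$ for all $k$, so both pointwise convergence at $0$ and uniform convergence fail:

```python
# Illustrative counterexample: f_k(x) = (1 - x)**k on [0, 1].
# Continuous, f_{k+1} <= f_k everywhere, f_k(x) -> 0 for x != 0,
# but f_k(0) = 1 for all k, so the sup norm never decays.

def f(k, x):
    return (1.0 - x) ** k

xs = [i / 1000 for i in range(1001)]  # grid on [0, 1]

for k in [1, 10, 100]:
    # monotone decreasing in k at every grid point
    assert all(f(k + 1, x) <= f(k, x) for x in xs)

# pointwise decay at any fixed x > 0
assert f(100, 0.1) < 1e-4

# yet sup_x |f_k(x) - 0| = 1 for every k, witnessed at x = 0
assert max(f(100, x) for x in xs) == 1.0
```

Since the sup norm stays at $1$ for every $k$, convergence is not uniform, and question 1 fails as well because $f_k(0) = 1 \not\to 0$.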

EDIT: A counterexample for the general case was pointed out in the comments. Would it be sufficient, however, if $f$ or the $f_k$ were Lipschitz continuous or of bounded variation?