Problem: Let $f:[0,1[ \to \mathbb{R}$ be a non-decreasing function such that $\int_0^1{f(x)\,dx}<+\infty$. Show that $$ \lim_{x\to 1^-}{(1-x)f(x)}=0.$$

Proof: $f$ is a monotonic function, so it admits a limit as $x \to 1^-$. We have: $$ \lim_{x\to 1^-}{f(x)}=\sup\{f(x): x\in [0,1[\}=l \in \mathbb{R} \cup\{+\infty\}.$$ If $l \in \mathbb{R}$ the claim follows trivially, so we assume that $l=+\infty$. Then we can find $a \in [0,1[$ such that $f(x)>0$ for all $x \in [a,1[$, and it follows that $$(1-x)f(x) \geq 0 \textrm{ for all } x \in [a,1[ \Rightarrow \lim_{x\to 1^-}{(1-x)f(x)}=m \geq 0.$$ Suppose by contradiction that $m>0$ with $m \in \mathbb{R}$. Then it follows from the definition of limit that there exists $\delta >0$ such that: $$ (1-x)f(x)> \frac{m}{2} \textrm{ for all } x\in [1-\delta,1[ \Rightarrow f(x)> \frac{m}{2(1-x)} \textrm{ for all } x\in [1-\delta,1[ .$$ The last inequality implies that $ \int_0^1{f(x)\,dx}=+\infty $, and we have a contradiction. The same argument works if we suppose that $m=+\infty$.
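For completeness, the divergence claimed in the last step can be written out (a routine computation, not spelled out in the post); since $f$ is non-decreasing, it is bounded on $[0,1-\delta]$, so only the tail matters:

```latex
\int_{1-\delta}^{1} f(x)\,dx
  \;\ge\; \int_{1-\delta}^{1} \frac{m}{2(1-x)}\,dx
  \;=\; \frac{m}{2}\,\lim_{t\to 1^-}\Bigl(\ln\delta - \ln(1-t)\Bigr)
  \;=\; +\infty .
```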
However, I am aware that this proof may be incomplete, because I would also have to rule out the possibility that $\lim_{x\to 1^-}{(1-x)f(x)}$ does not exist.
So, can someone give me some hints on how to improve my proof? Thank you in advance!
By subtracting the constant $f(0)$ from $f$, we can assume that $f\ge 0$; this changes $(1-x)f(x)$ only by $(1-x)f(0)\to 0$, so it does not affect the claim. Then $(1-x)f(x)\ge 0$, so if $(1-x)f(x)$ did not tend to zero, we would have $(1-x_n)f(x_n)\ge\epsilon>0$ for some $\epsilon>0$ and an increasing sequence $x_n\to 1$. Since $f$ is non-decreasing, $f(x)\ge f(x_{k-1})\ge \epsilon/(1-x_{k-1})$ for $x\ge x_{k-1}$, so $$ \int_{x_{k-1}}^{x_k} f(x)\, dx \ge \epsilon\,\frac{x_k-x_{k-1}}{1-x_{k-1}} . $$ Here we can also pass to a subsequence: if $x_k$ approaches $1$ sufficiently fast, say $x_k\ge 1-2^{-k}$ with $1-x_k\le (1-x_{k-1})/2$, then $\frac{x_k-x_{k-1}}{1-x_{k-1}}\ge \frac{1}{2}$, so each of these integrals is at least $\epsilon/2$. Summing over $k$ contradicts our assumption that $\int f<\infty$.
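As a numerical sanity check (purely illustrative, not part of the argument; the two example functions are my own choices), one can compare an integrable and a non-integrable non-decreasing function on $[0,1)$:

```python
import math

# f1 is non-decreasing and integrable on [0,1): its integral equals 2,
# and (1-x)*f1(x) = sqrt(1-x) -> 0, as the theorem predicts.
# f2 is non-decreasing but NOT integrable, and (1-x)*f2(x) = 1 for all x,
# showing the integrability hypothesis cannot be dropped.
f1 = lambda x: 1.0 / math.sqrt(1.0 - x)
f2 = lambda x: 1.0 / (1.0 - x)

def integral_up_to(f, b, n=100_000):
    """Midpoint-rule approximation of the integral of f over [0, b]."""
    h = b / n
    return sum(f((i + 0.5) * h) for i in range(n)) * h

x = 0.999999
print(integral_up_to(f1, x))   # close to 2
print((1 - x) * f1(x))         # small: tends to 0 as x -> 1
print((1 - x) * f2(x))         # exactly 1: the limit is not 0
```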