Monotonicity of sequence of functions - Dini's theorem


To apply Dini's theorem (https://en.wikipedia.org/wiki/Dini%27s_theorem) we must have monotonicity, i.e. either $f_k(x)\leq f_{k+1}(x)$ for all $x$ and all $k$ from some index onward, or $f_k(x)\geq f_{k+1}(x)$ for all $x$ and all $k$ from some index onward.


Let's assume we have a sequence of functions nice enough that we can interpret it as a two-dimensional function $f(x,k)$ which is differentiable with respect to $k$. Think of something like $f_k(x)= \frac{1}{k}+x.$

Is it legit to show the required monotonicity by means of the derivative $D_kf(x,k)$?

In the case of $f_k(x)= \frac{1}{k}+x$, we get $D_kf(x,k)=-\frac{1}{k^2}<0$ for every $k>0$. By the mean value theorem, $f(x,k+1)-f(x,k)=D_kf(x,\xi)$ for some $\xi\in(k,k+1)$, and this is negative. So for an arbitrary but fixed $x$, incrementing the index of the sequence of functions strictly decreases $f_k(x)$. So we have monotonicity.
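As a sanity check (not a proof), the derivative argument for this particular example can be verified numerically: the values $f_k(x_0)$ at a fixed $x_0$ should be strictly decreasing in $k$, matching the sign of $D_kf(x,k)=-1/k^2$. The choice $x_0=0.5$ below is arbitrary.

```python
# Check monotonicity in k of f_k(x) = 1/k + x at a fixed x,
# and that the sign agrees with the derivative D_k f = -1/k**2.
def f(x, k):
    return 1.0 / k + x

x0 = 0.5  # arbitrary but fixed x
values = [f(x0, k) for k in range(1, 10)]

# Each increment of k strictly decreases f_k(x0)
assert all(a > b for a, b in zip(values, values[1:]))

# D_k f(x, k) = -1/k**2 is negative for every k > 0
derivatives = [-1.0 / k**2 for k in range(1, 10)]
assert all(d < 0 for d in derivatives)
```

Of course this only inspects finitely many $k$ at one point $x_0$; the actual justification is the mean value theorem applied to $f(x,\cdot)$ on $[k,k+1]$.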

I see nothing wrong with that. However, it seems suspicious, because it would save a lot of work compared with verifying the inequality $f_{k+1}(x)\leq f_k(x)$ directly.