As the title says, I'd like to understand whether the following proof is correct. It concerns the well-known fact that every weakly increasing $f \in \Delta([m],[n])$ factors uniquely as $f = \delta^{i_1}\circ \cdots \circ \delta^{i_r}\circ \sigma^{j_{1}}\circ \dots \circ \sigma^{j_{s}}$.
The sketch of the proof is the following: define $k_0 := \min \{k \in [m] : f(k) \ne k\}$. Then we are in case $(i)$ if $k_0 < f(k_0)$, and in case $(ii)$ if $k_0 > f(k_0)$. The idea is that in case $(i)$ we can write $f = g \circ \delta^{k_0}$, where $g$ has "a $k_0$ greater than the one of $f$", and that case $(ii)$ reduces to case $(i)$ once we notice that in case $(ii)$ we can write $f = h \circ \sigma^{k_0-1}$.
The idea seems fine; my problem is understanding why, in case $(ii)$, the map $h$ should fit into case $(i)$. An explicit definition of $h$ is $$h(k) := \begin{cases}k & k < k_0 \\ f(k+1) & k \geq k_0 \end{cases}$$ It follows that "$h$ fits case $(i)$" is equivalent to $k_0 < f(k_0 + 1)$, and I don't see why this should hold: I can simply send everything strictly greater than $k_0$ to $k_{0}-1$, and everything seems to work.
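To make this concrete, here is a small Python check of the situation above (the specific values $m = 4$, $k_0 = 2$ are my choice for illustration; note that in case $(ii)$ one necessarily has $f(k_0) = k_0 - 1$, since $f(k_0 - 1) = k_0 - 1$ and $f$ is weakly increasing):

```python
# A concrete instance of the map described above.
# My choices: m = 4, k_0 = 2; f sends k_0 and everything above it to k_0 - 1.
k0 = 2

def f(k):
    return k if k < k0 else k0 - 1

def sigma(i):
    # sigma^i as defined in the question
    return lambda k: k if k <= i else k - 1

def h(k):
    # the explicit definition of h given above
    return k if k < k0 else f(k + 1)

# f = h ∘ sigma^{k_0 - 1} does hold on [4] = {0,...,4} ...
assert all(f(k) == h(sigma(k0 - 1)(k)) for k in range(5))

# ... but k_0 < f(k_0 + 1) fails:
print(k0 < f(k0 + 1))  # False, since f(k_0 + 1) = k_0 - 1
```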
The definitions of $\delta^{i}$ and $\sigma^{i}$ are $$\delta^{i}(k) := \begin{cases}k & k < i \\ k+1 & k \geq i \end{cases}$$
$$\sigma^{i}(k) := \begin{cases}k & k \leq i \\ k-1 & k > i \end{cases}$$
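In code, these definitions look as follows (a minimal Python sketch; the names `delta` and `sigma` are mine), together with a sanity check of the standard identity $\sigma^{i} \circ \delta^{i} = \mathrm{id}$:

```python
# delta^i and sigma^i as plain functions on integers,
# following the case definitions above.
def delta(i):
    # coface map: "skips" the value i
    return lambda k: k if k < i else k + 1

def sigma(i):
    # codegeneracy map: "hits" the value i twice
    return lambda k: k if k <= i else k - 1

# Sanity check of a standard simplicial identity: sigma^i ∘ delta^i = id
for i in range(4):
    assert all(sigma(i)(delta(i)(k)) == k for k in range(6))

# A sample composite delta^2 ∘ sigma^0 evaluated on [3] = {0,1,2,3}
f = [delta(2)(sigma(0)(k)) for k in range(4)]
print(f)  # [0, 0, 1, 3], a weakly increasing map
```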
Any help or tips on understanding whether this proof is correct, or how it could be fixed, would be appreciated. Thanks in advance.