In an MSE question, in a comment to an answer, Michael Hardy brought up the following well-known limit expression for the Euler constant $\gamma$: $$ \lim_{n \to \infty} \left(\sum_{k=1}^n \frac 1k\right) - \left(\int_{t=1}^n \frac 1t dt\right) = \gamma \tag 1$$
I've tried some variations, and heuristically I found, for small integers $m \gt 1$, $$ \lim_{n \to \infty} \left(\sum_{k=1}^n \frac 1{k^m}\right) - \left(\int_{t=1}^n \frac 1{t^m} dt\right) = \zeta(m) - \frac 1{m-1} \tag 2$$
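As a sanity check, (2) can be tested numerically. Here is a small Python sketch (the function name `lhs` and the cutoff `n` are my choices, not from the original) comparing the $m=2$ case against $\zeta(2)-1=\pi^2/6-1$:

```python
from math import pi

def lhs(m, n=10**6):
    """Partial sum of 1/k^m minus the exact integral of 1/t^m over [1, n]."""
    s = sum(k**-m for k in range(1, n + 1))
    # antiderivative of t^-m is t^(1-m)/(1-m), so the integral is exact
    integral = (1 - n**(1 - m)) / (m - 1)
    return s - integral

# For m = 2 the claimed limit is zeta(2) - 1 = pi^2/6 - 1
print(lhs(2), pi**2 / 6 - 1)
```

The two printed values agree to many digits; the truncation error behaves like $\frac{1}{2n^2}$ here.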
Generalizing further to real $m$, it seems (by Pari/GP experiments) that eq. (1) arises as the limit of (2) for $m \to 1$, and that the Euler $\gamma$ can be seen as the value of the Stieltjes power-series representation of $\zeta(1+x)$ with the pole term $\frac 1{(1+x)-1}=\frac 1x$ removed, evaluated at $x=0$.
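The $m \to 1$ behaviour can be probed the same way; a small Python sketch (the name `lhs` is mine; note that at $m=1$ the integral becomes $\log n$) shows the values of (2)'s left-hand side drifting toward $\gamma \approx 0.5772156649$:

```python
from math import log

def lhs(m, n=10**6):
    """Partial sum of 1/k^m minus the exact integral of 1/t^m over [1, n]."""
    s = sum(k**-m for k in range(1, n + 1))
    # for m = 1 the antiderivative is log t, otherwise t^(1-m)/(1-m)
    integral = log(n) if m == 1 else (1 - n**(1 - m)) / (m - 1)
    return s - integral

# As m -> 1 the value should approach Euler's gamma = 0.57721566...
for m in (1.1, 1.01, 1.001, 1):
    print(m, lhs(m))
```

The drift away from $\gamma$ for $m$ slightly above $1$ is governed by the first Stieltjes constant, consistent with the power-series picture above.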
Q1: Is there any intuitive explanation for this (or, for instance, a graphical demonstration)?
Another generalization heuristically suggested further curious identities:
$$ \small \begin{eqnarray}
\lim_{n \to \infty} \left(\sum_{k=2}^n \frac 1{k(k-1)}\right) &-& \left(\int_{t=2}^n \frac 1{t(t-1)} dt\right) &=& \frac 1{1!} \cdot\left(\frac 11 - 1\cdot \log(2)\right) \\
\lim_{n \to \infty} \left(\sum_{k=3}^n \frac 1{k(k-1)(k-2)}\right) &-& \left(\int_{t=3}^n \frac 1{t(t-1)(t-2)} dt\right) &=& \frac 1{2!} \cdot\left(\frac 12 - 2\cdot \log(2) + 1\cdot \log(3) \right) \\
\lim_{n \to \infty} \left(\sum_{k=4}^n \frac 1{k(k-1)(k-2)(k-3)}\right) &-& \left(\int_{t=4}^n \frac 1{t(t-1)(t-2)(t-3)} dt\right) &=& \frac 1{3!} \cdot\left(\frac 13 - 3\cdot \log(2) + 3\cdot \log(3) - 1\cdot \log(4) \right)
\end{eqnarray} \tag 3$$
where the coefficients on the rhs are binomial coefficients; I think the scheme is obvious enough to continue ad libitum.
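The first two rows of (3) can be checked numerically. A hedged Python sketch (helper names are mine), using the exact partial-fraction antiderivative $\sum_j A_j \log(t-j)$ for the integral:

```python
from math import log, factorial

def limit_lhs(m, n=10**6):
    """Sum minus integral of 1/(k(k-1)...(k-m)), both from m+1 to n."""
    s = 0.0
    for k in range(m + 1, n + 1):
        p = 1.0
        for j in range(m + 1):
            p *= (k - j)
        s += 1.0 / p
    # partial fractions: 1/(t(t-1)...(t-m)) = sum_j A_j/(t-j)
    # with A_j = (-1)^(m-j) / (j! (m-j)!)
    integral = 0.0
    for j in range(m + 1):
        A = (-1) ** (m - j) / (factorial(j) * factorial(m - j))
        integral += A * (log(n - j) - log(m + 1 - j))
    return s - integral

# rhs of the first two rows of (3)
rhs1 = 1 - log(2)
rhs2 = (0.5 - 2 * log(2) + log(3)) / 2
print(limit_lhs(1), rhs1)
print(limit_lhs(2), rhs2)
```

Both rows match to many decimal places, which at least supports the conjectured pattern.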
Again it might be possible to express this with further limits: we could, for instance, write the rhs in the third row as
$$ \lim_{h\to 0} \frac 1{3!} \cdot(- \small \binom{3}{-1+h} \cdot \log(0+h) +1 \cdot \log(1) - 3\cdot \log(2) + 3\cdot \log(3)- 1\cdot \log(4) ) \tag 4$$
Q2: Is (3) true, and how can it be proved (if it is not too complicated)? And is (4) somehow meaningful?
For Q1, the proof just relies on summation by parts.
For Q2, you can evaluate $$S_k = \sum_{n=1}^{+\infty}\frac{1}{n(n+1)\ldots(n+k)} = \frac{1}{k!}\sum_{n=1}^{+\infty}\frac{1}{n\binom{n+k}{k}}$$ by exploiting partial fraction decomposition and the residue theorem, or just the wonderful telescoping trick $\frac{1}{n(n+k)}=\frac{1}{k}\left(\frac{1}{n}-\frac{1}{n+k}\right)$, giving:
$$\begin{eqnarray*}S_k &=& \frac{1}{k}\left(\sum_{n=1}^{+\infty}\frac{1}{n(n+1)\ldots(n+k-1)}-\sum_{n=1}^{+\infty}\frac{1}{(n+1)(n+2)\ldots(n+k)}\right)\\ &=&\frac{1}{k}\cdot\frac{1}{1\cdot 2\cdot\ldots\cdot k}=\frac{1}{k\cdot k!},\end{eqnarray*}$$ since the two series telescope, leaving only the $n=1$ term of the first one. The same telescoping technique applies to the integral: $$I_k = \int_{1}^{+\infty}\frac{dt}{t(t+1)\ldots(t+k)}=\frac{1}{k}\int_{0}^{1}\frac{dt}{(t+1)\ldots(t+k)},$$ and now the RHS can be evaluated through partial fraction decomposition, since: $$\frac{1}{(t+1)\ldots(t+m)}=\frac{1}{(m-1)!}\sum_{j=0}^{m-1}\frac{(-1)^j\binom{m-1}{j}}{t+j+1}.$$ We have $\int_{0}^{1}\frac{dt}{t+h}=\log(h+1)-\log(h)=\log\left(1+\frac{1}{h}\right)$, hence: $$\begin{eqnarray*}I_k &=& \frac{1}{k(k-1)!}\sum_{j=0}^{k-1}(-1)^j\binom{k-1}{j}\left(\log(j+2)-\log(j+1)\right),\end{eqnarray*}$$ which, after rearranging terms, gives your $(3)$.
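Both the telescoping value $S_k=\frac{1}{k\cdot k!}$ and the closed form for $I_k$ can be double-checked numerically. A Python sketch (helper names are mine; the midpoint rule is just one convenient quadrature choice, not part of the argument):

```python
from math import log, factorial, comb

def S(k, N=10**5):
    """Partial sum of 1/(n(n+1)...(n+k)); should approach 1/(k*k!)."""
    total = 0.0
    for n in range(1, N + 1):
        p = 1.0
        for j in range(k + 1):
            p *= (n + j)
        total += 1.0 / p
    return total

def I_closed(k):
    """Claimed closed form for I_k from the partial fraction decomposition."""
    return sum((-1) ** j * comb(k - 1, j) * log((j + 2) / (j + 1))
               for j in range(k)) / (k * factorial(k - 1))

def I_numeric(k, steps=10**5):
    """Midpoint-rule quadrature for (1/k) * int_0^1 dt/((t+1)...(t+k))."""
    h = 1.0 / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * h
        p = 1.0
        for j in range(1, k + 1):
            p *= (t + j)
        total += h / p
    return total / k

for k in (2, 3):
    print(k, S(k), 1 / (k * factorial(k)), I_closed(k), I_numeric(k))
```

For each $k$ the second and third columns agree (checking $S_k$), and the last two agree (checking the closed form for $I_k$).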