I came across this observation while trying to solve a programming task, and I was wondering whether there's a proof for it. I tried proving it myself, but lacking a proper mathematical education, I haven't really gotten anywhere. The statement is as follows:
Given two natural numbers $x$ and $y$ with $x < y$ and $x^y > y^x$, the inequality $x^{y+k} > (y+k)^x$ holds for every $k > 0$.
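For context, here's the brute-force check I ran (a minimal Python sketch; `check_claim` and its bounds are just names I made up for this post). It found no counterexample among small naturals:

```python
def check_claim(max_x=20, max_y=50, max_k=50):
    """Search for a counterexample to the claim over small naturals."""
    for x in range(1, max_x + 1):
        for y in range(x + 1, max_y + 1):          # enforce x < y
            if x ** y > y ** x:                    # premise (exact integer arithmetic)
                for k in range(1, max_k + 1):
                    if x ** (y + k) <= (y + k) ** x:  # conclusion fails
                        return (x, y, k)
    return None                                    # no counterexample in range

print(check_claim())  # prints None: no counterexample found in this range
```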
Thanks in advance!
Take logarithms. Since $x,y>0$, taking logs and then dividing by $xy>0$ preserves the inequalities: $$x^y>y^x\iff y\log x>x\log y\iff \frac {\log x}x>\frac {\log y}y,$$ and similarly, $$x^{y+k}>(y+k)^x\iff \frac{\log x}x>\frac{\log(y+k)}{y+k},$$ so we consider the function $f(t)=\frac{\log t}t$ for $t>0$.
A little calculus shows that $f$ increases from $-\infty$ to $\frac1e$ on $(0,e]$ and then decreases to $0$ on $[e,\infty)$. So if $x<y$ and $f(x)>f(y)$, it must be that $y>e$: otherwise both $x$ and $y$ would lie in $(0,e]$, where $f$ is increasing, forcing $f(x)<f(y)$. Then for any $z>y$, since $f$ is strictly decreasing on $[e,\infty)$, we have $f(x)>f(y)>f(z)$.
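Concretely, the "little calculus" is the quotient rule: $$f'(t)=\frac{\frac1t\cdot t-\log t}{t^2}=\frac{1-\log t}{t^2},$$ which is positive for $0<t<e$, zero at $t=e$, and negative for $t>e$.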
Take $z=y+k$.
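For instance, with $x=2$ and $y=5$ we have $2^5=32>25=5^2$, and the argument then gives $2^{5+k}>(5+k)^2$ for every $k>0$, e.g. $2^6=64>36=6^2$ and $2^7=128>49=7^2$.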
Here's a graph of $f(t)$: [plot of $f(t)=\frac{\log t}t$, rising to its maximum $\frac1e$ at $t=e$ and then decaying toward $0$].