Sorry if this is a naive question; I am not very good at mathematics.
It seems that $x^y < y^x$ whenever $y \ll x$: e.g. $2^{10} > 10^2$. But if $x$ and $y$ are very close to each other, this can fail (simple case: $2^3 < 3^2$).
So: is there really such a regularity? Does it have a name? Is there an exact rule for which $x$ and $y$ it holds? (Let us, for simplicity, assume both are natural numbers.)
This is an interesting problem. Since you say you're "not very good at mathematics", I'll give the result first and then explain how I came up with it.
The result: if $e < x < y$, then $x^y > y^x$, where $e = 2.71828\ldots$ is Euler's number. For natural numbers, this means the smaller base always wins once both numbers are at least $3$. This explains why strange things happen around $2$: as you noticed, $2^4 = 4^2$, and the two numbers $2^3$ and $3^2$ are very close. But if $x$ and $y$ are both larger than $e \approx 2.71828$, you can just compare $x$ and $y$ directly to decide whether $x^y < y^x$: it holds exactly when $x > y$.
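If you want to experiment, here is a small Python sketch (my own illustration; the function name `compare` is just made up for this example) that decides the inequality exactly for natural numbers, using Python's arbitrary-precision integers:

```python
def compare(x, y):
    """Return '<', '>', or '=' describing x**y versus y**x.

    Exact for natural numbers: Python integers have arbitrary
    precision, so there is no overflow or rounding to worry about.
    """
    a, b = x ** y, y ** x
    return '<' if a < b else ('>' if a > b else '=')

# Examples from the discussion:
print(compare(10, 2))  # '<'  because 10^2 = 100 < 2^10 = 1024
print(compare(3, 2))   # '>'  because 3^2 = 9 > 2^3 = 8
print(compare(2, 4))   # '='  because 2^4 = 16 = 4^2
print(compare(4, 5))   # '>'  smaller base wins once both exceed e
```

Running it over a grid of small naturals is a quick way to see that $2$ is the only base that produces surprises.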
In more detail: taking logarithms, the inequality $x^y < y^x$ is equivalent to $y\ln x < x\ln y$, and dividing by $xy$ (which is positive) makes that equivalent to $(\ln x)/x < (\ln y)/y$. So your original problem reduces to determining where the function $f(t) = (\ln t)/t$ is increasing or decreasing. A short calculus computation shows that $f(t)$ is increasing when $t < e$ and decreasing when $t > e$. In particular, if $e < x < y$, then $f(x) > f(y)$, which unwinds to $x^y > y^x$.
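Concretely, the sign analysis is a one-line quotient-rule computation:

$$f'(t) = \frac{d}{dt}\,\frac{\ln t}{t} = \frac{\frac{1}{t}\cdot t - \ln t}{t^2} = \frac{1 - \ln t}{t^2},$$

which is positive exactly when $\ln t < 1$, i.e. $t < e$, and negative when $t > e$; so $f$ has its maximum at $t = e$.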