I'm trying to solve the following problem:
What is the smallest value of n so that an algorithm with a runtime of $5000 n \log(n)$ runs faster than an algorithm with a runtime of $2^{n/2}$ on the same computer?
So I figured it was just a matter of solving the inequality $5000 n \log(n) \leq 2^{n/2}$, but I have no idea how to do it. I've tried for a while using algebra, but that's leading me nowhere.
Take the log of both sides (I'm assuming that "log" here means $\log_2$) to get $$ \log(5000) + \log(n) + \log \log n \le \frac{n}{2}. $$ Since $\log n$ and $\log \log n$ are both small compared to $n$, you can say that $n$ is somewhere around $2\log(5000) \approx 25$. So start with $n = 20$ and $n = 52$, say; for one of these the left side should be smaller than the right, and for the other it should be greater. Apply bisection at most 6 times to get your answer.
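If you want to check the bisection numerically, here's a minimal sketch in Python (the bracket $[20, 52]$ and the helper name `f` are just illustrative choices):

```python
import math

def f(n):
    # Difference of the two runtimes: positive means the
    # 5000 n log2(n) algorithm is still the slower one.
    return 5000 * n * math.log2(n) - 2 ** (n / 2)

# Bracket the crossover: f(20) > 0, f(52) < 0.
lo, hi = 20, 52
while hi - lo > 1:
    mid = (lo + hi) // 2
    if f(mid) > 0:
        lo = mid      # crossover is above mid
    else:
        hi = mid      # crossover is at or below mid

print(hi)  # smallest n with 5000 n log2(n) < 2^(n/2); prints 41
```

Each iteration halves the bracket, so the loop terminates after about $\log_2(52 - 20) = 5$ steps, consistent with the bound above.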