The question is in the title: is it possible to simplify the complexity $O(n\log{}n + \frac{n^2}{m}\log{}m)$ ? $n$ and $m$ are two variables, and you know that $n > m$ (by the way, what if we don't know that?).
What I first thought was that it could be reduced by keeping only the second term, $O(\frac{n^2}{m}\log{}m)$, because $n^2$ dominates $n$ even after dividing by $m$. But the more I think about it, the less confident I am about this.
Any idea/explanation? That would help a lot.
Thanks!
In general, if you have two parameters $m$ and $n$ that can be independently adjusted, you are stuck with two parameters. Sometimes $m$ and $n$ are naturally tied together (maybe they're roughly proportional in most cases, or maybe $m$ is generally $O(1)$ and its typical values don't depend much on $n$) and then you can reduce it.
For instance, if you assume $m$ does not change and you are interested in how the complexity scales with $n$, then your approach of fixing $m$ and recognizing that the second term grows faster than the first is valid.
However, all you are told is that $m < n$; you're not told how much less. And this case is interesting because the second term actually decreases as $m$ grows. So the best case is $m = \Theta(n)$: plugging $m = n$ into the second term gives $\frac{n^2}{n}\log n = n\log n$, the same as the first term, for an overall $O(n\log n)$. The worst case is the one I mentioned before, where $m$ stays $O(1)$: then $\frac{n^2}{m}\log m = \Theta(n^2)$, and the overall complexity is $O(n^2)$.
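To make the two regimes concrete, here is a quick numeric sanity check (the value of $n$ and the constant $m=2$ are arbitrary choices, just for illustration): it compares the second term against $n\log n$ when $m$ is a small constant versus when $m = n$.

```python
import math

def term1(n):
    # n log n (first term)
    return n * math.log2(n)

def term2(n, m):
    # (n^2 / m) log m (second term)
    return (n * n / m) * math.log2(m)

n = 1_000_000

# Worst case: m constant (here m = 2) -> second term is Theta(n^2),
# which dwarfs n log n by a factor of roughly n / log n.
worst_ratio = term2(n, 2) / term1(n)

# Best case: m = n -> second term is (n^2 / n) log n = n log n,
# exactly the first term.
best_ratio = term2(n, n) / term1(n)

print(worst_ratio)  # very large (about 25000 for this n)
print(best_ratio)   # 1.0
```

So the expression genuinely spans everything from $O(n\log n)$ to $O(n^2)$ depending on how $m$ relates to $n$.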
All of this assumes that $m$ is (loosely) a function of $n$, in the sense that typical values of $m$ depend on $n$ in a predictable way. As I said at the beginning, if there's no such relationship and $m$ and $n$ are essentially independent, two parameters means two parameters.