I am learning about sort algorithms and their complexities.
For merge sort, I'm confronted with $T(n) \leq 2 T(\lceil \frac{n}{2} \rceil)+\mathcal{O}(n)$.
The author claims, without justification, that $T(n)=\mathcal{O}(n\log n)$.
Why is that?
I have no experience with this kind of recurrence.
Firstly, why is it intuitive that $T(n)=\mathcal{O}(n\log n)$?
Secondly, what techniques can be applied to solve such recurrences?
Here is a heuristic argument; it can be made rigorous by induction on $n$.
For the moment, ignore the ceiling in the argument of $T$. Expanding the recursive definition repeatedly, you get:
$$\begin{aligned}T(n)&\leq 2\,T\left(\tfrac{n}{2}\right)+\mathcal{O}(n)\\&\leq 2\left(2\,T\left(\tfrac{n}{4}\right)+\mathcal{O}\left(\tfrac{n}{2}\right)\right)+\mathcal{O}(n)=4\,T\left(\tfrac{n}{4}\right)+\mathcal{O}(2n)\\&\leq 4\left(2\,T\left(\tfrac{n}{8}\right)+\mathcal{O}\left(\tfrac{n}{4}\right)\right)+\mathcal{O}(2n)=8\,T\left(\tfrac{n}{8}\right)+\mathcal{O}(3n)\\&\;\;\vdots\\&\leq 2^k\,T\left(\tfrac{n}{2^k}\right)+\mathcal{O}(kn).\end{aligned}$$
When $k$ reaches $\log_2(n)$, so that $\frac{n}{2^k}=1$, this inequality becomes$$T(n)\leq n\,T(1)+\mathcal{O}(n\log_2(n))=\mathcal{O}(n\log_2(n)).$$
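You can also check the bound numerically. The sketch below assumes a concrete cost model in which the $\mathcal{O}(n)$ merge term is exactly $n$ and $T(1)=1$ (those constants are my choice for illustration, not from the original recurrence), and verifies that $T(n)\leq c\,n\log_2(n)$ holds for a single constant $c$ across a range of $n$:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """Cost under the model T(1) = 1, T(n) = 2*T(ceil(n/2)) + n.

    Taking the O(n) term to be exactly n is an assumption made
    purely so the recurrence can be evaluated numerically.
    """
    if n <= 1:
        return 1
    return 2 * T(math.ceil(n / 2)) + n

# Empirically confirm T(n) <= c * n * log2(n) for one fixed constant c.
c = 3
assert all(T(n) <= c * n * math.log2(n) for n in range(2, 20000))

# For powers of two the ceiling is exact and T(n) = n*log2(n) + n.
print(T(1024))  # 1024*10 + 1024 = 11264
```

For powers of two the closed form $T(n)=n\log_2(n)+n$ follows directly from the unrolled sum, which is why the check against $c\,n\log_2 n$ succeeds with a small constant.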