Suppose we have two sorting algorithms that take $O(n\log n)$ and $O(n^2)$ time. What can we say about them? Is it always better to choose the $O(n\log n)$ algorithm if the size $n$ is not given? Or can we say that, on average, $O(n \log n)$ outperforms $O(n^2)$? If I want to make one of the algorithms the default sorting algorithm of my system, should I always choose $O(n\log n)$ over $O(n^2)$?
Please give some input.
No. Suppose sorting algorithm $A$ takes $1000n \log_{10}(n)$ steps and algorithm $B$ takes $n^2$ steps, and we need to sort $n = 1000$ elements. Then $A$ takes $1000 \cdot 1000 \cdot \log_{10}(1000) = 3000000$ steps, but $B$ takes only $n^2 = 1000^2 = 1000000$ steps, so $B$ is faster here.
However, $A$ eventually wins. For example, for $n = 10000$, $A$ takes $1000 \cdot 10000 \cdot \log_{10}(10000) = 40000000$ steps while $B$ takes $10000^2 = 100000000$ steps, so $A$ is better. Big-$O$ notation hides constant factors, and those constants can dominate for small inputs.
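The comparison above can be checked with a short sketch. The step-count formulas ($1000 n \log_{10} n$ for $A$, $n^2$ for $B$) are the hypothetical costs from this example, not real measurements:

```python
import math

def steps_a(n):
    # Hypothetical cost of algorithm A from the example: 1000 * n * log10(n)
    return 1000 * n * math.log10(n)

def steps_b(n):
    # Hypothetical cost of algorithm B from the example: n^2
    return n * n

# At n = 1000 the quadratic algorithm B does fewer steps;
# at n = 10000 the n log n algorithm A has already overtaken it.
for n in (1000, 10000):
    winner = "A" if steps_a(n) < steps_b(n) else "B"
    print(f"n={n}: A={steps_a(n):.0f} steps, B={steps_b(n):.0f} steps, winner: {winner}")
```

Running it confirms that $B$ wins at $n = 1000$ but $A$ wins at $n = 10000$; the crossover point depends entirely on the hidden constants.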