I think I have a proof that $n\ln n$ is optimal in the sense that it is a lower bound for comparison-based sorting algorithms.
See here for a list.
The cost must grow faster than $n$, since a single linear pass is not enough, and the $\ln$ factor comes from the harmonic series, which reflects the repeated halving of the list being sorted.
Has this already been proved (or disproved)?
With $n$ elements there are $n!$ possible orderings. Each comparison you make between two elements reduces the space of possible orderings by at most half. In order to uniquely determine which ordering we are in, we have to perform enough comparisons to distinguish all possible orderings.
Distinguishing $n!$ orderings requires $\log_2 n!$ bits of information, which is to say we need at least $\log_2 n!$ comparisons in the worst case. Using Stirling's approximation, $\log_2(n!) = \frac1{\ln 2}\left(n \ln n - n + O(\ln n)\right)$. Asymptotically, $\log_2 n! = \Omega(n \log n)$, so any comparison sort needs $\Omega(n \log n)$ comparisons.
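To make the bound concrete, here is a small sketch (the helper names `log2_factorial` and `merge_sort_count` are my own, for illustration) that computes $\lceil \log_2 n! \rceil$ and compares it with the number of comparisons an instrumented mergesort actually performs:

```python
import math
import random

def log2_factorial(n):
    # log2(n!) computed via the log-gamma function: ln(n!) = lgamma(n + 1)
    return math.lgamma(n + 1) / math.log(2)

def merge_sort_count(a):
    """Return (sorted list, number of element comparisons performed)."""
    comparisons = 0

    def sort(xs):
        nonlocal comparisons
        if len(xs) <= 1:
            return xs
        mid = len(xs) // 2
        left, right = sort(xs[:mid]), sort(xs[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            comparisons += 1          # one comparison per merge step
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    return sort(list(a)), comparisons

for n in (8, 64, 512):
    data = random.sample(range(10 * n), n)
    _, c = merge_sort_count(data)
    print(f"n={n}: lower bound ceil(log2 n!) = {math.ceil(log2_factorial(n))}, "
          f"mergesort used {c} comparisons")
```

Note that the information-theoretic bound is a *worst-case* bound: on a lucky input a particular run may use fewer comparisons than $\lceil \log_2 n! \rceil$, but no comparison sort can beat it on every input.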