There are a few problems for which non-trivial lower bounds on the running time of any algorithm that solves them are known: sorting, copying words on a Turing machine...
What are some modern methods for proving lower bounds on running time? Can you give a reference? Thank you!
UPD: By all appearances, there is no well-developed general theory for proving lower bounds on running time. Nonetheless, it would be very interesting to see some examples of such proofs (other than sorting and copying words on a Turing machine).
One of the most common methods for proving lower bounds is the adversary method. The idea is that an adversary is allowed to pick the input "piece by piece" depending on what the algorithm does, which in turn can be used to show that there exists an input on which the algorithm must do at least $\Omega(f(n))$ work. As you mentioned, this can be applied to sorting: http://goldman.cse.wustl.edu/crc2007/handouts/adv-lb.pdf.
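To make the sorting bound concrete, here is a small Python sketch (my own illustration, not from the linked handout) of the information-theoretic form of the argument: each comparison reveals at most one bit, and the adversary can keep $n!$ orderings alive until the algorithm has asked $\lceil \log_2 n! \rceil$ questions, which is $\Omega(n \log n)$.

```python
import math

def sorting_comparison_lower_bound(n: int) -> int:
    """Worst-case comparisons any comparison sort needs on n
    distinct keys: ceil(log2(n!))."""
    # Sum log2(k) for k = 2..n instead of computing n! directly,
    # which avoids huge intermediate factorials.
    bits = sum(math.log2(k) for k in range(2, n + 1))
    return math.ceil(bits)

# There are n! possible orderings, and each comparison
# distinguishes at most a factor of 2 of them.
for n in (4, 10, 100):
    print(n, sorting_comparison_lower_bound(n))
```

For $n = 4$ this gives 5, which matches the known fact that 4 elements cannot always be sorted in fewer than 5 comparisons.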
This method has many other applications as well; there is a brief overview here: http://www.cs.uiuc.edu/~jeffe/teaching/algorithms/notes/19-lowerbounds.pdf. As an example, we can use the adversary method to show that evaluating an AND/OR boolean formula may require querying every variable in the worst case (though there is a nice randomized algorithm whose query complexity is better than $n$). The adversary method has also been used to prove lower bounds in quantum computing (http://arxiv.org/abs/quant-ph/0002066).