From a textbook on computational problems, there's a question I've been pondering...
Q: A priority queue supports operations to add a value and to find/remove the smallest value. Show that for any implementation that only accesses data by copying and comparing it, the worst-case complexity of an operation is $\Omega(\log n)$.
I have some questions about this. Firstly, I'm a little unsure about the importance of the "accesses but does not mutate" restriction. In what way would this affect the complexity? There is a hint pointing to a result from a previous page: sorting under the same restrictions requires $\Omega(n \log n)$ comparisons (proved using some logarithmic estimation/simplification). But I'm not sure how to take that result and make use of it here.
So far I'm stuck on a solution. Clearly the removal operation must have some way to compare the values, but surely that means it should be bounded by $\Omega(n)$, since a naive solution would involve examining each element of the queue to find the smallest?
First, remember that the $\Omega$ notation gives a lower bound for the complexity, so if you have an implementation where every operation takes $cn$ time for some constant $c$, that complexity is $\Omega(\log n)$ too!
Second, if you have a priority queue implementation that works under the given restrictions, you can use it to build a comparison sort algorithm: just insert all the elements you want to sort into the priority queue, and then extract them one by one.
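To make the reduction concrete, here is a minimal sketch in Python, using the standard library's `heapq` as a stand-in for the hypothetical priority queue (the argument only depends on the insert/extract-min interface, not on this particular implementation):

```python
import heapq

def pq_sort(values):
    """Sort by inserting everything into a priority queue,
    then repeatedly extracting the minimum."""
    pq = []
    for v in values:              # n insert operations
        heapq.heappush(pq, v)
    out = []
    while pq:                     # n extract-min operations
        out.append(heapq.heappop(pq))
    return out

print(pq_sort([5, 1, 4, 2, 3]))   # → [1, 2, 3, 4, 5]
```

Note that the resulting algorithm performs exactly $n$ inserts followed by $n$ extract-mins, so its total cost is at most $2n$ times the worst-case cost of a single priority-queue operation.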
Now, do an indirect proof: Suppose we had a priority queue whose worst-case complexity per operation was not $\Omega(\log n)$ (in other words, asymptotically better than $\log n$). Then the construction above would give a comparison sort whose worst-case complexity was not $\Omega(n \log n)$ -- there are some proof details to get right here; out of laziness I will leave those to you. But you already know that comparison sorts require $\Omega(n \log n)$ comparisons in the worst case, so the original too-good-to-be-true priority queue cannot exist!
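As a rough sketch of the accounting step (informal, glossing over the precise negation of $\Omega$):

```latex
\text{Suppose each operation ran in } o(\log n) \text{ worst-case time. Then sorting costs}
\underbrace{n}_{\text{inserts}} \cdot o(\log n) \;+\; \underbrace{n}_{\text{extract-mins}} \cdot o(\log n)
\;=\; o(n \log n),
\text{contradicting the } \Omega(n \log n) \text{ comparison-sort lower bound.}
```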