I've been asked to code two instances of Dijkstra's Algorithm. The first instance should determine the shortest path length between node 0 and each other node in the graph (in order of increasing path length) and then return the path length of a specified end node. The second instance uses the same code, but terminates as soon as the specified end node is reached.
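For concreteness, here is a minimal sketch of the two instances as I understand them. This uses a binary heap and a `stop_early` flag; the adjacency format, node labels, and function name are my own assumptions, not part of the assignment:

```python
import heapq

def dijkstra(adj, start, end, stop_early=False):
    """Shortest path length from `start` to `end`.

    adj: dict mapping node -> list of (neighbor, weight) pairs.
    stop_early=False: settle every reachable node, then report end's distance.
    stop_early=True:  return as soon as `end` itself is settled.
    Returns None if `end` is unreachable.
    """
    dist = {start: 0}          # best known distance to each node
    settled = set()            # nodes whose shortest distance is final
    heap = [(0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u in settled:
            continue           # stale heap entry
        settled.add(u)
        if stop_early and u == end:
            return d           # second instance: terminate at `end`
        for v, w in adj.get(u, []):
            nd = d + w
            if v not in dist or nd < dist[v]:
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist.get(end)
```

Note that with a heap the bound differs from a plain edge-scanning implementation, so this is only meant to pin down the control flow of the two variants, not their exact running time.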
I know that the first instance has a theoretical complexity of $O(nm)$ (where $n=|V|$ and $m=|E|$), but I can't work out the complexity of the second instance.
I have a feeling it might be either $O(n)$ or $O(n^2)$; can someone confirm or correct this?