Studying for an algorithms test, and a surprisingly simple problem has me somewhat stumped. The following is the question:
My issue has been that, looking online, everyone uses this kind of problem to segue into talking about Dijkstra's algorithm. That's all fine and good, but Dijkstra's is a single-source algorithm that finds ALL shortest paths from the source. That is powerful, but it is not O(V+E): the runtime of Dijkstra's is O((V+E) log V) with a binary heap.
I'm trying to envision how one would do a "single run" of Dijkstra's, terminating at the target node, while GUARANTEEING the O(V+E) runtime. I made an example in which I terminated Dijkstra's prematurely, and it happened to look O(V+E)... but I don't know how I could really justify its correctness or analyze the running time honestly; surely there is some graph on which my algorithm's runtime is worse.
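For concreteness, here is a minimal sketch of the "terminate at the target" idea (the function name and graph representation are my own, since the original question statement isn't shown). The early exit is safe because a vertex's distance is final once it is popped from the heap, but it does not change the worst-case bound: the target may be popped last, so this is still O((V+E) log V) with a binary heap, not O(V+E).

```python
import heapq

def dijkstra_early_exit(adj, s, t):
    """Dijkstra's algorithm that stops as soon as t is popped.
    adj maps each vertex to a list of (neighbor, weight) pairs."""
    dist = {s: 0}
    heap = [(0, s)]
    done = set()
    while heap:
        d, v = heapq.heappop(heap)
        if v in done:
            continue  # stale heap entry; v already finalized
        done.add(v)
        if v == t:
            return d  # early exit: d is the final distance to t
        for u, w in adj.get(v, []):
            nd = d + w
            if u not in dist or nd < dist[u]:
                dist[u] = nd
                heapq.heappush(heap, (nd, u))
    return None  # t unreachable from s
```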
Use two stacks $S_0$, $S_1$ and a 2-bit flag $f[v]$ per vertex.
Step 1 takes $O(|V|)$. Each vertex can be pushed onto some stack at most twice (because each push decreases $f[v]$, except for the initial push of $s$). Therefore step 6 sees each edge at most four times (twice per endpoint), so the overall runtime is $O(|V|+|E|)$. For correctness, note that $S_0$ and $S_1$ keep track of vertices at distance $d$ and $d+1$, respectively.
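The two-distance-level structure above suggests edge weights in {0, 1}; under that assumption, a minimal Python sketch of the scheme might look like the following (the function name and graph representation are my own). $f[v]$ starts at 2 (never pushed), drops to 1 when $v$ is pushed onto $S_1$ (tentative distance $d+1$ via a weight-1 edge), and drops to 0 when $v$ is pushed onto $S_0$ (distance $d$, via a weight-0 edge or initially for $s$); since every push decreases the flag, each vertex is pushed at most twice and the whole run is $O(|V|+|E|)$.

```python
def two_stack_shortest_path(adj, s, t):
    """Shortest s->t distance with edge weights in {0, 1}, using two
    stacks instead of a queue. S0 holds vertices at distance d,
    S1 holds vertices at tentative distance d+1.
    adj maps each vertex to a list of (neighbor, weight) pairs."""
    f = {v: 2 for v in adj}   # 2 = never pushed, 1 = on S1, 0 = on S0
    dist = {v: None for v in adj}
    S0, S1 = [s], []
    f[s] = 0                  # s goes straight onto S0 at distance 0
    d = 0
    while S0 or S1:
        if not S0:            # level d exhausted: advance to level d+1
            S0, S1 = S1, []
            d += 1
        v = S0.pop()
        if dist[v] is not None:
            continue          # stale entry; v already finalized
        dist[v] = d
        if v == t:
            return d          # early exit at the target
        for u, w in adj[v]:
            if w == 0 and f[u] > 0:
                f[u] = 0      # improved to distance d: push onto S0
                S0.append(u)
            elif w == 1 and f[u] == 2:
                f[u] = 1      # first reached, at distance d+1: onto S1
                S1.append(u)
    return None               # t unreachable from s
```

Stacks suffice here because, within one distance level, the order in which vertices are processed does not matter; only the level boundary does.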