Suppose we have a simple random walk on a graph (or, more generally, a reversible Markov chain) that begins at vertex $v$ and continues until vertex $u$ is first reached. What is the expected number of visits to vertex $v$?
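For concreteness, here is a quick Monte Carlo sanity check of the quantity in question. The graph (a 5-cycle) and the vertices $v = 0$, $u = 2$ are illustrative choices, not from the question; the estimator counts visits to $v$ at times $> 0$, matching the definition of $f$ below.

```python
import random

def expected_visits_mc(n, v, u, trials=20000, seed=0):
    """Monte Carlo estimate of the expected number of returns to v
    (visits at times > 0) before a simple random walk on the n-cycle,
    started at v, first reaches u."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        x = v
        while True:
            x = (x + rng.choice((-1, 1))) % n  # step to a uniform random neighbor
            if x == u:
                break          # u reached: stop this trial
            if x == v:
                total += 1     # one more return to v
    return total / trials

print(expected_visits_mc(5, 0, 2))
```

A gambler's-ruin computation on the two arcs of the cycle gives the exact value $7/5 = 1.4$ here, so the estimate should land close to that.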
Supposedly this is a well-known result, for example,

Is there an elementary proof of this fact (one not relying on theorems about stopping times)? I am trying to present this paper in the most self-contained way possible. It seems we could proceed by letting $f(x)$ be the expected number of visits to $v$ (at times greater than $0$), starting at $x$, before $u$ is reached.
Conditioning on the first step: moving to $v$ scores one visit and then contributes $f(v)$ more in expectation, moving to $u$ contributes nothing, and moving to any other $w$ contributes $f(w)$. Then
$$f(x) = P_{x,v}\bigl(1 + f(v)\bigr) + \sum_{w \neq u,v} P_{x,w} f(w).$$
This system of equations bears some resemblance to the stationary distribution equations, although $f(u)$ is not well defined (the walk stops at $u$). Is there a way to solve this?
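As a sketch of one way to make the system determined: absorbing the walk at $u$ amounts to setting $f(u) = 0$, and the equations above can then be written as $f = Qf + b$, where $Q$ is the transition matrix with the row and column of $u$ deleted and $b(x) = P_{x,v}$, so $f = (I - Q)^{-1} b$. The code below, assuming NumPy and reusing the 5-cycle example, solves this system directly; it is a numerical check, not the elementary proof asked for.

```python
import numpy as np

def expected_visits_exact(P, v, u):
    """Solve f = Q f + b, where Q is P with the row and column of the
    absorbing vertex u deleted (equivalently, f(u) = 0) and b(x) = P[x, v].
    Then f(x) is the expected number of visits to v at times > 0 before
    the walk started at x first reaches u."""
    n = P.shape[0]
    keep = [i for i in range(n) if i != u]
    Q = P[np.ix_(keep, keep)]
    b = P[keep, v]
    f = np.linalg.solve(np.eye(len(keep)) - Q, b)
    return dict(zip(keep, f))

# simple random walk on the 5-cycle
n = 5
P = np.zeros((n, n))
for i in range(n):
    P[i, (i - 1) % n] = P[i, (i + 1) % n] = 0.5

f = expected_visits_exact(P, v=0, u=2)
print(f[0])  # f(v): expected returns to v = 0 before hitting u = 2
```

Since the walk killed at $u$ is transient on the remaining vertices, $I - Q$ is invertible, so the system always has a unique solution; the question of an elementary closed form for $f(v)$ remains.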