Random walk on a cube (want mean number of visits)


Start a random walk at a vertex $v$ of a cube; at each step, the walk moves along one of the three edges incident to the current vertex, each with probability $1/3$. Let $w$ denote the vertex diametrically opposite $v$.

My question is: what is the expected (mean) number of visits to the opposite vertex $w$ before the walk's first return to $v$?

I only know how to use the Markov property to calculate the mean number of steps until the first return to $v$, or until the first visit to $w$. The methodology for this question seems quite different. Thanks for any help.


Edit: My lecturer wrote that,

if we denote the number of visits to $w$ before its first return to $v$ by a random variable $X$, then $P(X=n)=\theta^{n-1}\theta^2$, where $\theta$ is the probability that the first return of the walk to its starting point precedes its first visit to the diametrically opposed vertex.

I cannot see why the pmf of $X$ takes this form; to be more specific, I don't understand the intuition behind the expression. Can anyone help me understand it? Thanks.
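As a sanity check, the walk is easy to simulate. The sketch below (not from the original post) encodes the cube's vertices as 3-bit integers with $v = 0$ and $w = 7$, so that flipping one random bit moves along an edge, and it estimates $E(X)$ and $P(X=0)$:

```python
import random

def run_excursion(rng):
    """One excursion: start at v=0, walk until the first return to v.
    Return the number of visits to w=7 during the excursion."""
    pos = 1 << rng.randrange(3)        # first step away from v
    visits = 0
    while pos != 0:
        if pos == 7:
            visits += 1
        pos ^= 1 << rng.randrange(3)   # flip one coordinate = cross one edge
    return visits

rng = random.Random(0)
samples = [run_excursion(rng) for _ in range(200_000)]
mean = sum(samples) / len(samples)
p0 = samples.count(0) / len(samples)
print(f"estimated E(X)   = {mean:.3f}")   # theory: 1
print(f"estimated P(X=0) = {p0:.3f}")
```

The empirical distribution of the samples also lets you eyeball whether the pmf decays geometrically in $n$, which is the shape the lecturer's formula predicts.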


Best answer:

There are two different answers, depending on whether the walk is required to visit $w$ at least once before returning to $v$ or not. The hint given by your lecturer suggests the second interpretation, under which $X=0$ is possible (with probability $\theta$). For $X$ to equal $n \ge 1$, the walk first has to reach $w$ from $v$ before returning to $v$ (this has probability $1-\theta$), then to make $n-1$ excursions from $w$ back to $w$ that do not reach $v$ (each such excursion has probability $\theta$ by symmetry, and they are independent), and finally to reach $v$ from $w$ before revisiting $w$ (this has probability $1-\theta$, by symmetry again). Thus $$P(X=n)=(1-\theta)^2 \theta^{n-1} \,,$$ which is a little different from the formula you quoted. Therefore,

$$E(X)=\sum_{n=1}^\infty n (1-\theta)^2 \theta^{n-1}= (1-\theta)^2 \frac{d}{d \theta}\Bigl(\sum_{n=0}^\infty \theta^n \Bigr) = (1-\theta)^2 \cdot \frac{1}{(1-\theta)^2} = 1 \,.$$
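For the cube, $\theta$ itself can be computed exactly by first-step analysis on the walk's distance from $v$ (this calculation is a supplement, not part of the answer above). Writing $h_i$ for the probability of hitting $v$ before $w$ from a vertex at distance $i$, the two symmetry classes give $h_1 = \tfrac13 + \tfrac23 h_2$ and $h_2 = \tfrac23 h_1$, and $\theta = h_1$ since the first step lands at distance 1:

```python
from fractions import Fraction

# By symmetry, classify vertices by distance from v: 0 (=v), 1, 2, 3 (=w).
# h[i] = P(hit v before w | current distance i), with h[0]=1 and h[3]=0:
#   h1 = 1/3 * 1 + 2/3 * h2   (one neighbor is v, two are at distance 2)
#   h2 = 2/3 * h1 + 1/3 * 0   (two neighbors at distance 1, one is w)
# Substituting: h1 = 1/3 + (4/9) h1, so h1 = 3/5.
h1 = Fraction(1, 3) / (1 - Fraction(4, 9))
theta = h1  # after the first step the walk sits at distance 1
print(theta)  # 3/5

# Check E(X) = sum_{n>=1} n (1-theta)^2 theta^(n-1) = 1 via a partial sum
# (the geometric tail beyond n=200 is negligible):
E = sum(n * (1 - theta) ** 2 * theta ** (n - 1) for n in range(1, 200))
print(float(E))  # theory: 1
```

So for the cube $\theta = 3/5$, i.e. $P(X=0)=3/5$ and $P(X=n)=\bigl(\tfrac{2}{5}\bigr)^2\bigl(\tfrac{3}{5}\bigr)^{n-1}$ for $n \ge 1$, and the expectation works out to exactly one.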

In fact, this is a special case of a very general identity: For simple random walk (started at a vertex $v$) on any connected regular graph (where all vertices have the same degree), the expected number of visits to $w$ before returning to $v$ is one. More generally, this holds for every irreducible Markov chain with a uniform stationary distribution. See, e.g., Prop. 1.14 (page 11) in [1].
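The general identity is easy to test on another regular graph. Here is a small simulation sketch (my own check, under the same conventions as above) on the 6-cycle, where every vertex has degree 2 and the stationary distribution is uniform, so the expected number of visits to any fixed $w \ne v$ before returning to $v$ should again be one:

```python
import random

def visits_before_return(neighbors, v, w, rng):
    """Simple random walk from v; count visits to w before the first return to v."""
    pos = rng.choice(neighbors[v])
    count = 0
    while pos != v:
        if pos == w:
            count += 1
        pos = rng.choice(neighbors[pos])
    return count

# 6-cycle: a connected 2-regular graph, hence uniform stationary distribution.
n = 6
neighbors = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
rng = random.Random(1)
trials = 100_000
mean = sum(visits_before_return(neighbors, 0, 3, rng) for _ in range(trials)) / trials
print(f"mean visits to w before return to v: {mean:.3f}")  # theory: 1
```

Swapping in any other connected regular graph's adjacency lists (e.g. the Petersen graph) should give the same answer, which is the content of the proposition cited above.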

[1] https://www.yuval-peres-books.com/markov-chains-and-mixing-times/