Expected hitting time of Ehrenfest model


Suppose we have $N$ balls numbered from $1$ to $N$, and two boxes A and B. First, distribute the $N$ balls between the two boxes, and let $X_0$ be the number of balls in box A. At each step, pick one of the $N$ balls uniformly at random and move it to the other box. Denote by $X_n$ the number of balls in A after the $n$th move. Then $\{X_n:n\geq 0\}$ is a Markov chain with transition probabilities $p_{i,i+1}=1-i/N,\ p_{i,i-1}=i/N$. Let $\sigma_j=\inf\{n\geq 0:X_n=j\}$. Now we want to know, when $N=2d$, the behavior of the expected hitting time $E_0\sigma_d=E(\sigma_d\mid X_0=0)$.

A book on stochastic processes states that $E_0\sigma_d\leq d+d\ln d+O(1)$. I tried to get this by writing down the system of equations relating the $E_i\sigma_d$ and solving it to obtain such an estimate, but the direct computation seems quite hard, so I am looking for a cleverer way to show this.

Accepted answer:

Let $N \geq 1$ be arbitrary. Then one can check that

$$ \forall \ 0 \leq i < N \ : \quad \mathbb{E}_i[\sigma_{i+1}] = \frac{\sum_{k=0}^{i} \binom{N}{k}}{\binom{N-1}{i}}. \tag{*}$$
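As a sanity check of $(*)$, one can simulate the chain and compare empirical hitting times with the closed form. This is a rough Monte Carlo sketch, not part of the proof; the function names, trial count, and seed are arbitrary choices of mine:

```python
import random
from math import comb

def expected_hitting_closed_form(N, i):
    # E_i[sigma_{i+1}] from (*): (sum_{k=0}^{i} C(N,k)) / C(N-1,i).
    return sum(comb(N, k) for k in range(i + 1)) / comb(N - 1, i)

def simulate_hitting(N, i, n_trials=20000, seed=0):
    # Empirical mean of sigma_{i+1} for the chain started at X_0 = i.
    rng = random.Random(seed)
    total = 0
    for _ in range(n_trials):
        x, steps = i, 0
        while x != i + 1:
            # Move up with probability 1 - x/N, down with probability x/N.
            x += 1 if rng.random() >= x / N else -1
            steps += 1
        total += steps
    return total / n_trials

if __name__ == "__main__":
    N = 6
    for i in range(N):
        exact = expected_hitting_closed_form(N, i)
        approx = simulate_hitting(N, i)
        print(f"i={i}: closed form {exact:.3f}, simulated {approx:.3f}")
```

For small $N$ the simulated means land close to the closed-form values, which is reassuring before diving into the derivation.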

Now using the estimate $ \binom{N}{i-k}\big/\binom{N}{i} = \prod_{l=1}^{k} \frac{i-l+1}{N-i+l} \leq \left( \frac{i}{N-i} \right)^k $, for $i < N/2$ it follows that

$$ \mathbb{E}_i[\sigma_{i+1}] = \frac{N}{N-i} \sum_{k=0}^{i} \frac{\binom{N}{i-k}}{\binom{N}{i}} \leq \frac{N}{N-i} \sum_{k=0}^{\infty} \left( \frac{i}{N-i} \right)^k = \frac{N}{N-2i}. $$

Now let $N = 2d$, then

$$ \mathbb{E}_0[\sigma_d] = \sum_{i=0}^{d-1} \mathbb{E}_i[\sigma_{i+1}] \leq \sum_{i=0}^{d-1} \frac{d}{d-i} = d \log d + \gamma d + O(1), $$

where $\gamma$ is the Euler-Mascheroni constant.
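To see the bound in action, one can compare the exact value $\mathbb{E}_0[\sigma_d]$, summed term by term from $(*)$ with exact rational arithmetic, against the harmonic-sum bound $\sum_{i=0}^{d-1} d/(d-i) = d\,H_d$. A small sketch (the function names are mine):

```python
from fractions import Fraction
from math import comb, log

def exact_E0_sigma_d(d):
    # E_0[sigma_d] = sum_{i=0}^{d-1} E_i[sigma_{i+1}], each term from (*), N = 2d.
    N = 2 * d
    return sum(
        Fraction(sum(comb(N, k) for k in range(i + 1)), comb(N - 1, i))
        for i in range(d)
    )

def harmonic_bound(d):
    # sum_{i=0}^{d-1} d/(d-i) = d * H_d, the bound derived above.
    return sum(Fraction(d, d - i) for i in range(d))

if __name__ == "__main__":
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    for d in (5, 10, 50):
        e, b = exact_E0_sigma_d(d), harmonic_bound(d)
        print(f"d={d}: exact {float(e):.3f} <= bound {float(b):.3f} "
              f"~ d log d + gamma*d = {d * log(d) + gamma * d:.3f}")
```

Every $i$ in the sum satisfies $i < N/2$, so the per-term bound applies throughout.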


Remark. A better asymptotic formula is given by

$$ \mathbb{E}_0[\sigma_d] = \frac{1}{2}d\log d + \left( \frac{\gamma}{2} + \log 2 \right)d + o(d). $$

I realized that I had already obtained this result before. This link contains a proof.


Derivation of $\text{(*)}$ - 1st way. Write $a_i = \mathbb{E}_i[\sigma_{i+1}]$. Then $\mathbb{E}_i[\sigma_N] = a_i + \cdots+ a_{N-1} $ for $i < N$ and by the Markov property we obtain

$$ \mathbb{E}_i[\sigma_N] = 1 + \left(1-\frac{i}{N}\right)\mathbb{E}_{i+1}[\sigma_N] + \frac{i}{N}\mathbb{E}_{i-1}[\sigma_N]. $$

Plugging the identity $\mathbb{E}_i[\sigma_N] = a_i + \cdots + a_{N-1}$ into this and rearranging, we obtain the recurrence relation

$$ \left(1-\frac{i}{N}\right) a_i = \frac{i}{N} a_{i-1} + 1. $$

Solving this equation together with $a_0 = 1$ gives the desired identity. ////
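The recurrence can be verified against $(*)$ directly with exact arithmetic; a small sketch (function names are mine, assuming nothing beyond the recurrence and the closed form above):

```python
from fractions import Fraction
from math import comb

def a_from_recurrence(N):
    # Solve (1 - i/N) a_i = (i/N) a_{i-1} + 1 forward, starting from a_0 = 1.
    a = [Fraction(1)]
    for i in range(1, N):
        a.append((Fraction(i, N) * a[i - 1] + 1) / (1 - Fraction(i, N)))
    return a

def a_closed_form(N):
    # a_i = (sum_{k=0}^{i} C(N,k)) / C(N-1,i), i.e. identity (*).
    return [Fraction(sum(comb(N, k) for k in range(i + 1)), comb(N - 1, i))
            for i in range(N)]

if __name__ == "__main__":
    N = 10
    assert a_from_recurrence(N) == a_closed_form(N)
    print("recurrence and closed form agree for N =", N)
```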

Derivation of $\text{(*)}$ - 2nd way. Notice that $\pi(i) = \binom{N}{i}$ is an invariant measure and the chain is reversible with respect to $\pi$. This induces an electric network on the chain with conductances $c_{i,j} = \pi(i)p_{i,j}$. Now by the hitting time identity, we obtain

$$ \mathbb{E}_i[\sigma_{i+1}] = \mathcal{R}_{\text{eff}}(i,i+1)\sum_{x=0}^{N}\pi(x) v(x), $$

where $\mathcal{R}_{\text{eff}}(i,i+1)$ is the effective resistance between $i$ and $i+1$ and $v$ is the unit voltage from $i$ to $i+1$. Now it is immediate that

$$ \mathcal{R}_{\text{eff}}(i,i+1) = \frac{1}{c_{i,i+1}} = \frac{1}{\binom{N-1}{i}} \qquad \text{and} \qquad v(x) = \mathbb{P}_x(\sigma_i < \sigma_{i+1}) = \mathbf{1}_{\{0,1,\cdots,i\}}(x). $$

Plugging this back yields $\text{(*)}$ as required. ////
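Numerically, the network computation reduces to a product of two simple sums, and one can check with exact arithmetic both that $c_{i,i+1}$ simplifies to $\binom{N-1}{i}$ and that the identity reproduces $(*)$. A quick sketch, with function names of my own choosing:

```python
from fractions import Fraction
from math import comb

def conductance(N, i):
    # c_{i,i+1} = pi(i) * p_{i,i+1} with pi(i) = C(N,i) and p_{i,i+1} = 1 - i/N;
    # this simplifies to C(N-1, i).
    return Fraction(comb(N, i) * (N - i), N)

def hitting_via_network(N, i):
    # Hitting time identity: E_i[sigma_{i+1}] = R_eff(i,i+1) * sum_x pi(x) v(x),
    # with R_eff = 1 / c_{i,i+1} and v = 1 on {0,...,i}, 0 elsewhere.
    return Fraction(1) / conductance(N, i) * sum(comb(N, x) for x in range(i + 1))

def hitting_closed_form(N, i):
    # Identity (*).
    return Fraction(sum(comb(N, k) for k in range(i + 1)), comb(N - 1, i))

if __name__ == "__main__":
    N = 7
    assert all(conductance(N, i) == comb(N - 1, i) for i in range(N))
    assert all(hitting_via_network(N, i) == hitting_closed_form(N, i)
               for i in range(N))
    print("network identity reproduces (*) for N =", N)
```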