Probability that all lights $\mathbf{X}=(X_1,X_2,\cdots)$ turn off again when each step toggles a light whose number is $n\sim\text{geom}(\frac{1}{2})$


Problem: Let $\mathbf{X} \in (\mathbb{Z}_2)^\mathbb{N}$, i.e., $\mathbf{X} = (X_1,X_2,\cdots,X_N,\cdots)$ with $X_i\in \{0,1\}$. It can be thought of as a countable row of lightbulbs: $0$ means off, $1$ means on. We start with $\mathbf{X}_0 = 0$ and keep generating independent geometric random variables $K_1, K_2,\cdots$, each with distribution $\text{geom}(1/2)$ (so $\mathbb P(K=n)=2^{-n}$ for $n\ge 1$). Now let $\mathbf{X}_m$ (for $m \ge 1$) be given by $$(\mathbf{X}_m-\mathbf{X}_{m-1})_k = \mathbf{1}(k = K_m),$$ i.e., in the $m$-th turn we toggle only the $K_m$-th light bulb. What, then, is the probability that all lights are off again at some point, i.e., $$\mathbb P(\exists m\ge 1,\ \mathbf{X}_m =0)?$$
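The process is easy to simulate, which gives a rough feel for the answer before any theory. Below is a minimal Monte Carlo sketch (the helper names and the step budget of $400$ are my own arbitrary choices); note it can only lower-bound the return probability, since a run that has not returned within the budget may still return later.

```python
import random

def geom_half(rng):
    # Sample K with P(K = n) = 2^{-n}, n = 1, 2, ...
    k = 1
    while rng.random() < 0.5:
        k += 1
    return k

def returns_to_zero(max_steps, rng):
    # Run the chain for up to max_steps steps; report whether all
    # lights were off again at some step m >= 1.
    on = set()                               # indices of bulbs currently on
    for _ in range(max_steps):
        on.symmetric_difference_update({geom_half(rng)})   # toggle one bulb
        if not on:
            return True
    return False

rng = random.Random(0)
trials = 5000
hits = sum(returns_to_zero(400, rng) for _ in range(trials))
print(hits / trials)   # lower bound on the return probability
```

The estimate creeps upward very slowly as the step budget grows, which is consistent with the answer being $1$ but with slow convergence.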

My first intuition below was wrong, I'm afraid.

My intuition was that since $1/4+1/16+1/64+\cdots=1/3$, the final answer might be $1/2$. Since all lights being off means every light has been toggled an even number of times, I tried to use generating functions, but that doesn't work: it is hard to encode the condition "every light is toggled an even number of times" in the coefficients. Brute-force calculation also seems hard for the same reason, since there are too many cases to consider, and I'm stuck here. Are there any other ways to deal with this question? Thanks!

New intuition: We may define a function $f(x_1,x_2,\cdots)$ as the probability that the status $\mathbf{X}=(x_1,x_2,\cdots)$ will eventually become all $0$; our goal is then to calculate $\sum_{n=1}^{\infty}\frac{1}{2^n}f(0,\cdots,0,1,0,\cdots)$, where the $1$ appears only in the $n$-th coordinate. Conditioning on one step gives all the transition equations for $f$, an uncountable system of linear equations. We can see immediately that this system has the solution $f(x_1,x_2,\cdots)\equiv 1$. Can we conclude that this solution is unique (in which case the answer to this problem is $1$)?
Continuing the intuition above, we define $$g(x_1,x_2,\cdots)=f(x_1,x_2,\cdots)-1,$$ and moreover we set $g(0,0,\cdots,0,\cdots)=0$.
Obviously $f\le1$, hence $g\le 0$ and $g$ attains its maximum at $(0,0,0,\cdots,0,\cdots)$. Note that after rewriting all the equations for $f$ in terms of $g$, the constant terms cancel, the coefficients on the right-hand side are strictly positive, and the coefficients always sum to $1$. For example, the equation $$f(1,0,0,\cdots,0,\cdots)=\frac{1}{2}+\frac{1}{4}f(1,1,0,\cdots,0,\cdots)+\frac{1}{8}f(1,0,1,\cdots,0,\cdots)+\cdots$$ becomes $$g(1,0,0,\cdots,0,\cdots)=\frac{1}{2}g(0,0,0,\cdots,0,\cdots)+\frac{1}{4}g(1,1,0,\cdots,0,\cdots)+\frac{1}{8}g(1,0,1,\cdots,0,\cdots)+\cdots$$ We want to use the maximum principle, but we lack an equation with $\text{LHS}=g(0,0,0,\cdots,0,\cdots)$. Is there a way to supply this equation?

Edit: Another possible line of attack from here: ignore $g(0,0,0,\cdots,0,\cdots)$ and focus only on the remaining terms. We would like to discard any configuration with infinitely many $1$s, since the probability of reaching such a configuration is $0$. If the maximum is attained at one of the remaining configurations we're done; what we don't want is the maximum being attained only in the limit, and we wish to prove that this cannot happen because the limit is (somewhat?) decaying. (I don't know how to proceed here, but I'll keep posting my thoughts.)

Three answers follow.

Best Answer:

It already accumulated more than 10 upvotes, so it is probably time to post a solution. Leonbloy essentially said everything there was to say: this is a Markov chain, so showing that the probability in question is $1$ is equivalent to showing that the all-off state is recurrent. The standard recurrence criterion is $\sum_m P_m=+\infty$, where $P_m$ is the probability that all the lights are off after $m$ steps. The only issue is finding (or estimating) $P_m$.

By a direct computation, $$ P_m=\sum_{(k_1,\dots,k_m)\in Q_m} 2^{-(k_1+\dots+k_m)} $$ where $Q_m$ is the set of $m$-tuples in which every number occurs an even number of times. The direct combinatorial evaluation of $P_m$ seems not easy (unless I'm missing some trick), but the same quantity can be expressed as $$ P_m=E\left[\left(\sum_{k\ge 1} 2^{-k}Y_k\right)^{m}\right], $$ where the $Y_k$ are independent random variables taking the values $\pm 1$ with probability $1/2$ each (just expand the $m$-th power and distribute the expectation: a product $Y_{k_1}\cdots Y_{k_m}$ has expectation $1$ if every index occurs an even number of times and $0$ otherwise). For even $m$ the integrand is nonnegative, and on the event $\{Y_1=Y_2=\dots=Y_\ell=1\}$ the sum is at least $(1-2^{-\ell})-2^{-\ell}=1-2^{-(\ell-1)}$, so the expectation is at least $$ P\{Y_1=Y_2=\dots=Y_\ell=1\}\left(1-2^{-(\ell-1)}\right)^m=2^{-\ell}\left(1-2^{-(\ell-1)}\right)^m $$ for every $\ell\ge 1$. Optimizing over $\ell$ (take $\ell\approx\log_2 m$), we get a lower bound for $P_m$ comparable to $1/m$ for even $m$, so the series $\sum_m P_m$ diverges and we are done.
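The two expressions for $P_m$ can be sanity-checked numerically (my addition, not part of the argument): truncating to the first $K$ bulbs, the sum over $Q_m$ and the moment $E\bigl[\bigl(\sum_{k\le K}2^{-k}Y_k\bigr)^m\bigr]$ can both be computed exactly by enumeration, and they should agree up to floating-point error. The function names and the choice $m=4$, $K=12$ are arbitrary.

```python
from itertools import product
from collections import Counter

def p_m_bruteforce(m, K):
    # Sum of 2^{-(k1+...+km)} over tuples in {1..K}^m in which
    # every value occurs an even number of times.
    total = 0.0
    for t in product(range(1, K + 1), repeat=m):
        if all(c % 2 == 0 for c in Counter(t).values()):
            total += 2.0 ** (-sum(t))
    return total

def p_m_moment(m, K):
    # E[(sum_{k<=K} 2^{-k} Y_k)^m], enumerating all 2^K sign vectors exactly.
    total = 0.0
    for signs in product((-1.0, 1.0), repeat=K):
        s = sum(sgn * 2.0 ** (-(k + 1)) for k, sgn in enumerate(signs))
        total += s ** m
    return total / 2 ** K

m, K = 4, 12
print(p_m_bruteforce(m, K), p_m_moment(m, K))
```

The truncation error is of order $4^{-K}$ here, so both values land very close to the true $P_4$.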

Second Answer:

I believe that with probability $1$ the lights will all be off again at some point. The argument:


For $n$ even, let $p_n$ be the probability that all the light bulbs are off after $n$ steps. The main idea of our argument will be to show that

Claim: There is a constant $C>0$ such that $$p_n \geq \frac{C}{n}$$ for all even $n$.

Why is the Claim Enough? If the claim is true, then the expected number of times at which all the bulbs are off is infinite (The Harmonic Series Diverges), which implies that the probability all the bulbs are off at some point is $1$. (This is a standard idea for random walks -- if the probability of return was $p<1$, then the expected number of returns would be $p+p^2+p^3+\dots$, which would be finite).

A reformulation of the Problem: Consider sets $A_1, A_2, \dots$ generated as follows: $A_1 = \{1, \dots, n\}$, and $A_{k+1}$ is generated from $A_k$ by independently keeping each element with probability $\frac{1}{2}$.

We can think of $A_i$ here as the set of times (among the $n$ steps) at which we toggle a bulb in position $i$ or later; by the memorylessness of the geometric distribution, a time in $A_i$ belongs to $A_{i+1}$ independently with probability $\frac12$. The bulbs will all be off exactly when every $A_i$ has even cardinality (bulb $i$ is toggled $|A_i|-|A_{i+1}|$ times, and these differences are all even iff all the $|A_i|$ are even). So we're trying to bound the probability that we keep on getting sets of even cardinality until we reach the empty set (at which point all subsequent sets automatically have even cardinality).

The Intuition: Why $\frac{C}{n}$? If we're given $A_k$ (and $A_k$ is non-empty), then $A_{k+1}$ is a uniformly random subset of $A_k$. Exactly half of those subsets have even cardinality, so we "survive" (our next set has even cardinality) with probability $\frac{1}{2}$. On the other hand, our set shrinks by about half each time we take a subset, so it'll take roughly $\log_2 n$ rounds before we're down to the empty set. The chance we survive all those rounds should be roughly $2^{-\log_2 n} = \frac{1}{n}$.
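The subset-chain intuition above is easy to simulate (a hedged sketch of my own; the sample sizes and starting sizes are arbitrary): estimate the probability $p$ that every set in the chain has even cardinality until the empty set is reached, and watch $n \cdot p$ stay bounded away from $0$.

```python
import random

def survives(n, rng):
    # One run of the subset chain: start with |A_1| = n, keep each element
    # independently with probability 1/2 at every step, and "survive" if
    # every set has even cardinality until the empty set is reached.
    size = n
    while size > 0:
        if size % 2:                 # odd cardinality: run fails
            return False
        size = sum(rng.random() < 0.5 for _ in range(size))
    return True

rng = random.Random(0)
trials = 50000
est = {n: sum(survives(n, rng) for _ in range(trials)) / trials
       for n in (8, 16, 32)}
for n, p in est.items():
    print(n, p, n * p)               # n * p stays bounded away from 0
```

The products $n \cdot p$ indeed hover near a constant rather than decaying, as the $\frac{C}{n}$ claim predicts.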

Making this All Precise: Our intuition above had two parts: We survive with probability $\frac{1}{2}$ with each new subset, and we're done after roughly $\log_2 n$ rounds. The goal of the following lemma is to decouple those two observations in a certain sense.

Lemma: Suppose $A_1$ is given and nonempty, and $t$ is a positive integer with $P(A_t = \emptyset) > 0$. Then $$P\left(A_2 \textrm{ has even size} \,\middle|\, A_t = \emptyset\right) \geq \frac{1}{2}$$

Proof: Write $n = |A_1|$. We have \begin{eqnarray*} P\left(|A_2| \, \textrm{even} \wedge A_t = \emptyset\right)-P\left(|A_2| \, \textrm{odd} \wedge A_t = \emptyset\right) &=& \sum_{k=0}^n (-1)^k P\left(|A_2|=k\right) P\left(A_t = \emptyset \biggl| |A_2|=k\right) \\ &=& 2^{-n} \sum_{k=0}^n (-1)^k \binom{n}{k} P\left(A_t = \emptyset \biggl| |A_2|=k\right) \\ &=& 2^{-n} \sum_{k=0}^n (-1)^k \binom{n}{k} \left(1-2^{-(t-2)}\right)^k \end{eqnarray*} (each element of $A_2$ dies by time $t$ with probability $1-2^{-(t-2)}$, since it must be discarded in one of the $t-2$ subsequent thinnings, and $A_t$ is empty exactly when all of the elements of $A_2$ die).

By the Binomial Theorem, this simplifies to $$2^{-n} \left( 1 - \left(1-2^{-(t-2)}\right) \right)^n = 2^{-n}\, 2^{-(t-2)n} \geq 0,$$ so, conditioned on $A_t = \emptyset$, even size is at least as likely as odd size, which proves the lemma.

Given this Lemma, we now prove the main claim. Let $m=\lfloor \log_2 n \rfloor +3$. Let $B_1$ be the event that $A_m$ is empty, and $B_2$ be the event that all of $A_1, A_2, \dots, A_m$ have even size. If $B_1$ and $B_2$ both hold, then all the lights are off at time $n$.

By repeated applications of the lemma (once for each of the $m-1$ thinning steps), we get that $$P(B_2 \mid B_1) \geq 2^{-(m-1)} \geq 2^{-m}$$ It follows that $$P(B_2 \wedge B_1) \geq 2^{-m} P(B_1)$$

Each element survives the $m-1$ thinnings down to $A_m$ with probability $2^{-(m-1)}$, so by a union bound, and using $m-1 = \lfloor \log_2 n \rfloor + 2 \geq \log_2 n + 1$, $$P(B_1^C) \leq n\, P(1 \in A_m) = n\, 2^{-(m-1)} \leq n\, 2^{-(\log_2 n +1)} = \frac{1}{2}$$ It follows that $P(B_1) \geq \frac{1}{2}$, and since $m \leq \log_2 n + 3$, $$p_n \geq P(B_2 \wedge B_1) \geq \frac{1}{2}\, 2^{-m} \geq \frac{1}{2}\, 2^{-(\log_2 n +3)} = \frac{1}{16n},$$ which proves the claim with $C = \frac{1}{16}$.

Third Answer:

Let $u_k$ be the probability that all lights are off after $2k$ steps.

We decompose that event according to the number of steps that flip the first bulb, using the law of total probability and the fact that a geometric distribution conditioned to exceed $1$ is (after shifting) again geometric. Each step flips the first bulb with probability $\frac12$; if exactly $2(k-j)$ of the $2k$ steps do so (there are $\binom{2k}{2j}$ ways to choose which steps avoid it), the remaining $2j$ steps must leave bulbs $2,3,\dots$ all off, which happens with probability $u_j$. This gives the recursion:

$$ u_k =\frac{1}{2^{2k}} \sum_{j=0}^k \binom{2k}{2j} u_{j} \iff u_k=\frac{1}{4^k-1} \sum_{j=0}^{k-1} \binom{2k}{2j} u_{j}\tag{1}$$

with the initial condition $u_0=1$. The solution [*] can be obtained explicitly, and is surprisingly simple:

$$u_{k}=\frac{1}{2k+1} \tag 2$$

Then the sum $\sum_{k=0}^\infty u_k$ diverges; by the properties of Markov chains, the all-off state is recurrent, and the probability of returning to zero is $1$.

If we wish to compute the probability of returning within a given number of steps: letting $f_k$ be the probability that the first "return to zero" (first time all lights are off again) occurs after exactly $2k$ steps (with $f_0=0$), we have the well-known renewal relationship $$ u_k = f_1 u_{k-1} + f_2 u_{k-2} +\cdots +f_k u_0 \tag 3$$

I've not found an explicit formula for $f_k$. The first values are $$ \begin{array}{c|c|c} k & u_k & f_k\\ \hline 1 & 0.333333 & 0.333333 \\ 2 & 0.200000 & 0.088888 \\ 3 & 0.142857 & 0.046561 \\ 4 & 0.111111 & 0.030194 \\ 5 & 0.090909 & 0.021797 \\ \end{array} $$
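The recursion $(1)$ and the renewal equation $(3)$ can be checked in exact arithmetic; this is my own verification script (function names are arbitrary), and it reproduces the table above.

```python
from fractions import Fraction
from math import comb

def u_exact(K):
    # u_k from recursion (1): u_k = (1/(4^k - 1)) * sum_{j<k} C(2k, 2j) u_j
    u = [Fraction(1)]
    for k in range(1, K + 1):
        s = sum(comb(2 * k, 2 * j) * u[j] for j in range(k))
        u.append(s / (4 ** k - 1))
    return u

def f_from_u(u):
    # First-return probabilities from the renewal equation (3):
    # f_k = u_k - sum_{i=1}^{k-1} f_i u_{k-i}
    f = [Fraction(0)]
    for k in range(1, len(u)):
        f.append(u[k] - sum(f[i] * u[k - i] for i in range(1, k)))
    return f

u = u_exact(5)
f = f_from_u(u)
assert all(u[k] == Fraction(1, 2 * k + 1) for k in range(6))  # checks (2)
print([float(x) for x in f[1:]])   # matches the table above
```

Working with `Fraction` avoids any floating-point doubt when confirming that $u_k = \frac{1}{2k+1}$ exactly.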

The convergence is very slow, though: numerically $\sum_{j=1}^k f_j \approx 1 - \frac{a}{\log k}$ for some constant $a>0$, and the probability of having returned to zero within $1400$ steps is still below $0.8$.

[Plot: cumulative return probability $\sum_{j\le k} f_j$ against the number of steps.]


  • Added: the solution of $(1)$:

Let $S(x)= \left( 1+x \right)^{2k}=\sum_{i=0}^{2k} \binom{2k}{i} x^i $ and

$$G(x)= \frac12 \left( S(x) + S(-x) \right) = \sum_{j=0}^{k} \binom{2k}{2j} x^{2j}$$

Then evaluating $\int_0^1 G(x) dx $ we get $$ \sum_{j=0}^{k} \binom{2k}{2j} \frac{1}{2j+1}={2}^{2k} \frac{1}{2 k+1} $$

Hence indeed $(2)$ is the solution of $(1)$, with the right initial condition $u_0=1$: plugging $u_j=\frac{1}{2j+1}$ into the right-hand side of $(1)$ gives $$\frac{1}{4^k-1}\left(\frac{4^{k}}{2k+1}-\frac{1}{2k+1}\right)=\frac{1}{2k+1}=u_k.$$
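The binomial-sum identity behind this addendum can also be confirmed in exact arithmetic (a small check of mine, not part of the original derivation):

```python
from fractions import Fraction
from math import comb

# Check  sum_{j=0}^{k} C(2k, 2j)/(2j+1) = 4^k/(2k+1)  exactly for small k.
for k in range(1, 10):
    lhs = sum(Fraction(comb(2 * k, 2 * j), 2 * j + 1) for j in range(k + 1))
    assert lhs == Fraction(4 ** k, 2 * k + 1)
print("identity holds for k = 1..9")
```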