Problem: Let $\mathbf{X} = (\mathbb{Z}_2)^{\mathbb N}$, i.e., $\mathbf{X} = (X_1,X_2,\cdots,X_N,\cdots)$, $X_i\in \{0,1\}$. It can be thought of as countably many lightbulbs: $0$ means off, $1$ means on. We start with $\mathbf{X}_0 = 0$ and keep generating independent geometric random variables $K_1, K_2,\cdots$, each distributed as $\mathrm{Geom}(1/2)$, i.e., $\mathbb P(K=k)=2^{-k}$ for $k\ge 1$. Now let $\mathbf{X}_m$ (for $m \ge 1$) be given by $$(\mathbf{X}_m-\mathbf{X}_{m-1})_k = \mathbf{1}(k = K_m),$$ i.e., on the $m$-th turn we only change the status of the $K_m$-th lightbulb. What is the probability that all lights are eventually off again, i.e., $$\mathbb P(\exists m>1,\ \mathbf{X}_m =0)?$$
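The process is easy to simulate, which gives a quick empirical feel for the question before any analysis. Below is a minimal sketch (function names are mine, not part of the problem): the state is stored as the set of bulbs currently on, each step draws $K\sim\mathrm{Geom}(1/2)$ and toggles bulb $K$, and we record whether the all-off state recurs within a step budget.

```python
import random

def run_once(max_steps, rng):
    """Simulate the toggling process from the all-off state.

    Returns the first m >= 1 with X_m = 0, or None if no return
    occurs within max_steps steps.
    """
    on = set()  # indices of bulbs currently on
    for m in range(1, max_steps + 1):
        # K ~ Geom(1/2) on {1, 2, ...}: P(K = k) = 2^{-k}
        k = 1
        while rng.random() < 0.5:
            k += 1
        on ^= {k}  # toggle bulb k (symmetric difference)
        if not on:
            return m
    return None

rng = random.Random(0)
trials = 2000
returned = sum(run_once(100, rng) is not None for _ in range(trials))
print(returned / trials)  # fraction of runs returning to all-off within 100 steps
```

Note that any return time is necessarily even (each step flips one bit, so the parity of the number of lit bulbs alternates), and a large fraction of runs return quickly; whether the probability is exactly $1$ is the point of the question.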
I'm afraid my first intuition below was wrong.
My intuition was that since $1/4+1/16+1/64+\cdots=1/3$, the final answer might be $1/2$. Since all lights being off means every lightbulb has been hit an even number of times, I tried generating functions, but it's hard to set up a correspondence between "every lightbulb is hit an even number of times" and generating function coefficients. Brute-force calculation also seems hard for the same reason: there are too many cases to consider, and I'm stuck here. Are there any other thoughts on how to deal with this question? Thanks!
New intuition: We may define $f(x_1,x_2,\cdots)$ as the probability that the state $\mathbf{X}=(x_1,x_2,\cdots)$ eventually becomes all $0$. Our goal is then to calculate $\sum_{n=1}^{\infty}\frac{1}{2^n}f(0,\cdots,0,1,0,\cdots)$, where the $1$ appears only in the $n$-th coordinate, and we can write down all the transition equations for $f$, which form an equation system indexed by uncountably many states. We see immediately that the system has the solution $f(x_1,x_2,\cdots)\equiv 1$. Can we conclude that this solution is unique (in which case the answer to this problem is $1$)?
Continuing the intuition above, we define $$g(x_1,x_2,\cdots)=f(x_1,x_2,\cdots)-1,$$ and moreover we set $g(0,0,\cdots,0,\cdots)=0$.
Obviously $f\le1$, hence $g\le 0$ and $g$ attains its maximum at $(0,0,0,\cdots,0,\cdots)$. Note that after rewriting all the equations for $f$ in terms of $g$, the constant terms cancel, the coefficients on the right-hand side are strictly positive, and the coefficients always sum to $1$. For example, the equation
$$f(1,0,0,\cdots,0,\cdots)=\frac{1}{2}+\frac{1}{4}f(1,1,0,\cdots,0,\cdots)+\frac{1}{8}f(1,0,1,\cdots,0,\cdots)+\cdots$$
is changed into
$$g(1,0,0,\cdots,0,\cdots)=\frac{1}{2}g(0,0,0,\cdots,0,\cdots)+\frac{1}{4}g(1,1,0,\cdots,0,\cdots)+\frac{1}{8}g(1,0,1,\cdots,0,\cdots)+\cdots$$
We want to use the maximum principle, but we lack an equation whose left-hand side is $g(0,0,0,\cdots,0,\cdots)$. Is there any way to supply such an equation?
Edit: Another possible intuition from here: ignore $g(0,0,0,\cdots,0,\cdots)$ and focus only on the remaining terms. We would like to discard states with infinitely many $1$s, since the probability of reaching such a state is $0$. If the maximum is attained at one of the remaining states we're done; what we don't want is the maximum occurring only at a limiting state, and we wish to prove that this cannot happen because the limit is (somehow?) decaying. (I don't know how to proceed here, but I'll keep posting my intuitions these days anyway.)

The question has already accumulated more than 10 upvotes, so it is probably time to post a solution. Leonbloy essentially said everything there was to say: this is a Markov chain, so showing that the probability in question is $1$ is equivalent to showing that the chain is recurrent. The standard recurrence criterion is $\sum_m P_m=+\infty$, where $P_m$ is the probability that all lights are off after $m$ steps. The only issue is finding (or estimating) $P_m$.
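Before estimating $P_m$ analytically, it can be checked by simulation (a sketch only; helper names are mine). For $m=2$ the value is exact and elementary: $P_2=\mathbb P(K_1=K_2)=\sum_k 4^{-k}=1/3$, and $P_m=0$ for odd $m$ by parity.

```python
import random

def geom_half(rng):
    """Sample K with P(K = k) = 2**-k for k = 1, 2, ..."""
    k = 1
    while rng.random() < 0.5:
        k += 1
    return k

def estimate_P(m, trials, rng):
    """Monte Carlo estimate of P_m = P(all lights off after m steps)."""
    hits = 0
    for _ in range(trials):
        on = set()
        for _ in range(m):
            on ^= {geom_half(rng)}  # toggle the sampled bulb
        hits += not on
    return hits / trials

rng = random.Random(42)
print(estimate_P(2, 50_000, rng))  # should be close to 1/3
```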
By a direct computation, $$ P_m=\sum_{(k_1,\dots,k_m)\in Q_m} 2^{-(k_1+\dots+k_m)}, $$ where $Q_m$ is the set of $m$-tuples in which every number occurs an even number of times. The direct combinatorial evaluation of $P_m$ seems not easy (unless I'm missing some trick), but it is clear that the same quantity can be expressed as $$ E\left[\left(\sum_{k\ge 1} 2^{-k}Y_k\right)^{m}\right], $$ where the $Y_k$ are independent random variables taking the values $\pm 1$ with probability $1/2$ each (just expand the $m$-th power and distribute the expectation: a product $Y_{k_1}\cdots Y_{k_m}$ has expectation $1$ if every index occurs an even number of times and $0$ otherwise). For even $m$ the integrand is nonnegative, so this expectation is at least $$ P\{Y_1=Y_2=\dots=Y_\ell=1\}\,(1-2^{-(\ell-1)})^m=2^{-\ell}(1-2^{-(\ell-1)})^m $$ for every $\ell>0$, since on that event $\sum_{k\ge 1}2^{-k}Y_k\ge(1-2^{-\ell})-2^{-\ell}=1-2^{-(\ell-1)}$. Optimizing over $\ell$ (take $\ell\approx\log_2 m$), we get a lower bound for $P_m$ comparable to $1/m$ for even $m$, so the series $\sum_m P_m$ diverges and we are done.
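The identity between the combinatorial sum and the expectation can be verified exactly for small $m$ if both sides are truncated at $k\le K$: the same expansion argument applies verbatim to the finite sum. A quick sketch with exact rational arithmetic (function names are mine):

```python
from itertools import product
from collections import Counter
from fractions import Fraction

def P_truncated(m, K):
    """Sum of 2^{-(k_1+...+k_m)} over tuples with all k_i <= K in which
    every value occurs an even number of times."""
    total = Fraction(0)
    for ks in product(range(1, K + 1), repeat=m):
        if all(c % 2 == 0 for c in Counter(ks).values()):
            total += Fraction(1, 2 ** sum(ks))
    return total

def E_truncated(m, K):
    """E[(sum_{k<=K} 2^{-k} Y_k)^m] by enumerating all 2^K sign patterns."""
    total = Fraction(0)
    for ys in product((-1, 1), repeat=K):
        s = sum(Fraction(y, 2 ** k) for k, y in enumerate(ys, start=1))
        total += s ** m
    return total / 2 ** K

for m in (2, 3, 4):
    assert P_truncated(m, 5) == E_truncated(m, 5)
print(P_truncated(2, 5), P_truncated(4, 5))
```

The truncated expectation also dominates the lower bound $2^{-\ell}(1-2^{-(\ell-1)})^m$ for $\ell\le K$ and even $m$, by the same event-restriction argument.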