Let $ (X_n)_{n \geq 1} $ be a sequence of independent Bernoulli random variables with parameter $p$ (independence is needed for the factorizations below). For $n \geq 2$, define
$A_n := \{ \omega \in \Omega : X_n(\omega) \neq X_{n-1}(\omega) \} $
$ v(\omega) := \inf \{ n \geq 2 : \omega \in A_n \} $ with the convention $ \inf \emptyset = + \infty $
We want to prove that $ v$ is a random variable, find its law, and prove that $P(v = +\infty) = 0$
The way I started thinking about it was
$ v(\omega) : = \inf \{ n \geq 2 : \omega \in A_n \} $
$ v(\omega) = \inf \{ n \geq 2 : X_n(\omega) \neq X_{n-1}(\omega) \} $
I was hoping to apply some result about increasing or decreasing sequences of sets, but I couldn't find a pertinent one. The solution starts by saying:
For all $n \geq 2$, we have $ \{ v = n \} = \bigcap^{n-2}_{i=1} \{ X_i = X_{i+1} \} \cap \{ X_{n-1} \neq X_n \}$
I don't see where this identity comes from, which is what this question is about.
For more context, the solution then affirms that:
Since the $X_n$ are random variables, the sets $\{ X_i = X_{i+1} \}$, $1 \leq i \leq n-2$, and $\{ X_{n-1} \neq X_{n} \}$ are elements of the $\sigma$-algebra $F$ (with which we endowed $\Omega$). Hence so is $\{ v = n \}$, which shows that $v$ is a random variable.
Then it goes on to say that:
For all $n \geq 2 $ , we have that
$P( v = n) = P(X_1 = X_2 = \dots = X_{n-1} = 0)P(X_n = 1)$ $ + P(X_1 = X_2 = \dots = X_{n-1} = 1)P(X_n = 0)$ $ = (1-p)^{n-1}p + p^{n-1}(1-p) $, which characterizes the law of $v$
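Spelling out the first factorization (this is where the independence of the $X_n$ is used):
$$P(X_1 = \dots = X_{n-1} = 0,\; X_n = 1) = \Big(\prod_{i=1}^{n-1} P(X_i = 0)\Big) P(X_n = 1) = (1-p)^{n-1} p,$$
and symmetrically $P(X_1 = \dots = X_{n-1} = 1,\; X_n = 0) = p^{n-1}(1-p)$.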
Let's calculate the probability of the complementary event $\{ v < + \infty \}$
$P(v < \infty) = P\big( \bigcup^{\infty}_{n=2} \{ v= n \} \big)$
$= \sum^{\infty}_{n=2} \big( (1-p)^{n-1}p + p^{n-1}(1-p) \big) = 1 $
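The last equality follows from two geometric series (this assumes $0 < p < 1$, so both series converge; for $p \in \{0,1\}$ the sequence is a.s. constant and $v = +\infty$ a.s.):
$$\sum_{n=2}^{\infty} (1-p)^{n-1} p = p \cdot \frac{1-p}{1-(1-p)} = 1-p, \qquad \sum_{n=2}^{\infty} p^{n-1} (1-p) = (1-p) \cdot \frac{p}{1-p} = p,$$
and $(1-p) + p = 1$.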
From which we conclude that $ P(v=\infty)=0 $
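As a numerical sanity check of the law $P(v = n) = (1-p)^{n-1}p + p^{n-1}(1-p)$, here is a minimal simulation sketch (the helper names `first_change`, `exact_law`, and `empirical_law` are mine, not from the solution):

```python
import random

def first_change(xs):
    """First index n >= 2 (1-based) with xs[n-1] != xs[n-2], else None."""
    for n in range(2, len(xs) + 1):
        if xs[n - 1] != xs[n - 2]:
            return n
    return None

def exact_law(p, n):
    """P(v = n) = (1-p)^(n-1) p + p^(n-1) (1-p)."""
    return (1 - p) ** (n - 1) * p + p ** (n - 1) * (1 - p)

def empirical_law(p, n_max=6, trials=200_000, seed=0):
    """Empirical frequencies of {v = n} for n = 2, ..., n_max."""
    rng = random.Random(seed)
    counts = dict.fromkeys(range(2, n_max + 1), 0)
    for _ in range(trials):
        # Simulating n_max Bernoulli(p) variables suffices to detect v <= n_max.
        xs = [1 if rng.random() < p else 0 for _ in range(n_max)]
        n = first_change(xs)
        if n is not None:
            counts[n] += 1
    return {n: c / trials for n, c in counts.items()}
```

For, say, $p = 0.3$ the empirical frequencies match the formula to within Monte Carlo error, and the exact probabilities sum to $1$ over $n \geq 2$.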
For an arbitrary $\omega\in\Omega$ and a fixed integer $n\geq2$, the following statements are equivalent:

- $v(\omega) = n$;
- $\omega \in A_n$, and $\omega \notin A_k$ for every $k$ with $2 \leq k < n$ (the infimum is attained at $n$ and at no earlier index);
- $X_n(\omega) \neq X_{n-1}(\omega)$, and $X_k(\omega) = X_{k-1}(\omega)$ for every $k$ with $2 \leq k < n$.

This justifies the conclusion that: $$\{v=n\}=\{X_n\neq X_{n-1}\}\cap\bigcap_{2\leq k<n}\{X_k=X_{k-1}\}$$ which is the solution's identity after the reindexing $i = k-1$.