Let $Z_1, Z_2, Z_3, \dots$ be a sequence of independent random variables such that $P(Z_i=1)=P(Z_i=-1)=\frac12$. Does this sequence converge almost surely?
I am still bothered by the definition of almost sure convergence. If we define a random variable $Z$ on the sample space $\{\omega_1,\omega_2\}$ by $Z(\omega_1)=-1$ (negative sign) and $Z(\omega_2)=1$ (positive sign), doesn't the sequence then converge to $Z$, since $P(\lim_{n\to\infty} Z_n(\omega)=Z(\omega))=1$? But the answer should be no, so I am confused about this definition (apologies in advance, as I am new to this).
I think I know how to prove it with the second Borel–Cantelli lemma: the probabilities $\frac12$ sum to infinity, so the events $\{Z_n=1\}$ (and likewise $\{Z_n=-1\}$) occur infinitely often, which means the sequence does not converge. But would someone please explain the above definition?
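For reference, a sketch of that Borel–Cantelli step (assuming the second Borel–Cantelli lemma for independent events): since $$\sum_{n=1}^\infty P(Z_n=1)=\sum_{n=1}^\infty\frac12=\infty\quad\text{and likewise}\quad\sum_{n=1}^\infty P(Z_n=-1)=\infty,$$ both events occur infinitely often with probability $1$, so almost every sample path oscillates between $1$ and $-1$ forever and has no limit.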
If the $Z_i$ are iid, then the underlying sample space $\Omega$ cannot consist of only the two elements $\omega_1$ and $\omega_2$, as you seem to think.
So we do not have something like $Z_i(\omega_1)=Z(\omega_1)=-1$ and $Z_i(\omega_2)=Z(\omega_2)=1$.
Note that if $\Omega=\{\omega_1,\omega_2\}$ with $P(\{\omega_1\})=P(\{\omega_2\})=\frac12$ (as the distribution of the $Z_i$ requires), then: $$P(Z_1=1\wedge Z_2=1)=P(\{\omega_2\})=\frac12\neq\frac14=P(\{\omega_2\})\times P(\{\omega_2\})=P(Z_1=1)P(Z_2=1),$$ contradicting that $Z_1$ and $Z_2$ are independent.
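A tiny sanity check of that computation, as a sketch in Python (the two-point model and its names are hypothetical, taken from the question's setup):

```python
# Hypothetical two-point model from the question: Omega = {w1, w2},
# each with probability 1/2, and every Z_i maps w1 -> -1, w2 -> +1.
omega = {"w1": 0.5, "w2": 0.5}
Z = lambda w: -1 if w == "w1" else 1  # the same map for every Z_i

# Since Z_1 = Z_2 on this space, {Z_1 = 1 and Z_2 = 1} = {w2}.
p_joint = sum(p for w, p in omega.items() if Z(w) == 1)
p_prod = sum(p for w, p in omega.items() if Z(w) == 1) ** 2

print(p_joint, p_prod)  # 0.5 0.25 -> joint != product, so not independent
```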
The $Z_i$ are identically distributed (so the sequence trivially converges in distribution), but in addition they are independent.
Convergence almost surely to $Z$ requires that $$P(\{\omega\in\Omega\mid \lim_{n\to\infty} Z_n(\omega)=Z(\omega)\})=1\tag1$$
Assume that this is true.
From $(1)$ it follows, first, that $Z_n$ converges in distribution to $Z$, so $Z$ must have the same distribution as the $Z_n$.
Second, it can be shown that $P(Z_n=1\wedge Z_{n+1}=1)$ converges to $P(Z=1\wedge Z=1)=P(Z=1)$: almost sure convergence implies convergence in probability, so for large $n$ both $Z_n$ and $Z_{n+1}$ agree with $Z$ with high probability.
But by independence we have $P(Z_n=1\wedge Z_{n+1}=1)=\frac14$, and because $Z$ has the same distribution as the $Z_n$ we have $P(Z=1)=\frac12$.
So we conclude that the assumption is not correct: the independent $Z_n$ do not converge almost surely to any $Z$.
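A quick numerical illustration of both points, as a sketch (the simulation sizes and the seed are arbitrary choices; requires only `numpy`):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate many independent sample paths of Z_1, ..., Z_n.
n_paths, n_steps = 10_000, 1_000
Z = rng.choice([-1, 1], size=(n_paths, n_steps))

# A +/-1-valued sequence converges only if it is eventually constant;
# as a finite-horizon proxy, count paths whose second half is constant.
tail = Z[:, n_steps // 2:]
settled = np.all(tail == tail[:, [0]], axis=1)
print(settled.mean())  # ~0.0: essentially no path settles

# The joint probability used in the answer: by independence,
# P(Z_n = 1 and Z_{n+1} = 1) should be about 1/4, not 1/2.
print(np.mean((Z[:, :-1] == 1) & (Z[:, 1:] == 1)))  # ~0.25
```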