It's been a while since I've done probability, but this appears to say:
for 4.3.4:
$\limsup B_n = \{\sum_{n=1}^{\infty} E(1_{B_n} \mid \mathcal F_{n-1}) = \infty\}$, which I take to hold almost surely, i.e. the two events agree up to a null set.
P.S. I thought $\mathcal F_0$ was always trivial... that sounds like something a book should state in earlier chapters.
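A quick numerical sanity check of this reading (a toy setup of my own, not from the book): take the $B_n$ independent with $P(B_n) = 1/n$, so that $E(1_{B_n} \mid \mathcal F_{n-1}) = P(B_n)$ and the series diverges; 4.3.4 then predicts the events keep occurring, so the running occurrence count should grow without bound.

```python
import random

random.seed(0)

# Toy check of the 4.3.4 reading, assuming independent B_n with
# P(B_n) = 1/n: the conditional probabilities reduce to the plain ones,
# their sum diverges, so B_n should occur i.o. (count grows like log n).
counts = []          # occurrence counts recorded at n = 10, 100, ..., 100000
count = 0
checkpoint = 10
for n in range(1, 100_001):
    count += random.random() < 1.0 / n   # did B_n occur?
    if n == checkpoint:
        counts.append(count)
        checkpoint *= 10

print(counts)   # occurrence counts at n = 10, 100, 1000, 10000, 100000
```

With a divergent $\sum 1/n$ the count should roughly track the harmonic sum; swapping in $P(B_n) = 1/n^2$ instead would make it stall at a finite value.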
for 4.5.3:
The 'adapted' wording is a little odd: 'adapted' usually describes random variables, not events. I think the events $\{B_n\}$ are adapted to the filtration $\{\mathcal F_n\}$ if and only if the indicator variables $\{1_{B_n}\}$ are adapted to $\{\mathcal F_n\}$; and since $1_{B_n}$ is $\mathcal F_n$-measurable exactly when $B_n \in \mathcal F_n$ (note $1_{B_n}^{-1}(\{1\}) = B_n$), this is basically the same as the condition $B_n \in \mathcal F_n$ in 4.3.4.
4. Without the 'on' clause, this means:
4.1. $\liminf \frac{\sum_{m=1}^n 1_{B_m}}{\sum_{m=1}^n p_m} = \limsup \frac{\sum_{m=1}^n 1_{B_m}}{\sum_{m=1}^n p_m}$ (surely, or at least almost surely), so that we can define $\lim \frac{\sum_{m=1}^n 1_{B_m}}{\sum_{m=1}^n p_m}$ as their common value: $[\lim \frac{\sum_{m=1}^n 1_{B_m}}{\sum_{m=1}^n p_m}](\omega) := [\liminf \frac{\sum_{m=1}^n 1_{B_m}}{\sum_{m=1}^n p_m}](\omega) = [\limsup \frac{\sum_{m=1}^n 1_{B_m}}{\sum_{m=1}^n p_m}](\omega)$, where $[\liminf \frac{\sum_{m=1}^n 1_{B_m}}{\sum_{m=1}^n p_m}](\omega) := \liminf_n \frac{\sum_{m=1}^n 1_{B_m}(\omega)}{\sum_{m=1}^n p_m(\omega)}$ and similarly for $\limsup$.
4.2. and $\lim \frac{\sum_{m=1}^n 1_{B_m}}{\sum_{m=1}^n p_m} = 1$ almost surely, i.e. $P(\{\omega \in \Omega : [\lim \frac{\sum_{m=1}^n 1_{B_m}}{\sum_{m=1}^n p_m}](\omega) = 1\}) = 1$.
4.3. I think that if $B_n \to \infty$ and $\lim \frac{A_n}{B_n} = 1$ almost surely, then $\lim A_n = \infty$ almost surely as well. (A full 'if and only if' with $\lim A_n = \lim B_n$ fails: $A_n = 1/n$ and $B_n = 1/n^2$ have the same limit $0$, yet $A_n/B_n = n \to \infty$.) For the ratio to make sense I also guess $B_n(\omega) \ne 0$ for all $n$ and all $\omega$, so...
4.4. ...I guess that $p_m(\omega) \ne 0$ for all $m$ and all $\omega$, or at least $\sum_{m=1}^n p_m(\omega) \ne 0$ for all $n$ and all $\omega$. Since (if I'm reading the theorem right) $p_m = E(1_{B_m} \mid \mathcal F_{m-1}) \ge 0$, this just says the partial sums are eventually positive, which holds automatically on the set where $\sum_{m=1}^{\infty} p_m = \infty$.
5. Now including the 'on' clause: this appears to be basically an 'if' clause, so I guess $$\{\omega \in \Omega : \sum_{m=1}^{\infty} p_m(\omega) = \infty \} \subseteq \{\omega \in \Omega : [\lim \frac{\sum_{m=1}^n 1_{B_m}}{\sum_{m=1}^n p_m}](\omega) = 1\},$$ at least up to a null set.
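As a sanity check on the 'on' clause, here is a toy adapted scheme of my own (nothing from the book): $p_1 = 0.5$, and each later $p_n$ is computed from whether $B_{n-1}$ occurred, so $p_n$ is $\mathcal F_{n-1}$-measurable. Since every $p_n \ge 0.2$, we get $\sum p_m = \infty$ on every path, and the ratio should approach 1.

```python
import random

random.seed(0)

# Toy adapted scheme: p_n is F_{n-1}-measurable because it is a function
# of the outcome of B_{n-1}.  Every p_n lies in {0.2, 0.8}, so the p-sums
# diverge on every path and 4.5.3 predicts the ratio tends to 1.
N = 200_000
p_next = 0.5    # p_1 is a constant, hence F_0-measurable
ind_sum = 0.0   # running value of sum 1_{B_m}
p_sum = 0.0     # running value of sum p_m
for n in range(1, N + 1):
    p = p_next
    hit = random.random() < p        # B_n occurs with conditional prob p
    ind_sum += hit
    p_sum += p
    p_next = 0.8 if hit else 0.2     # p_{n+1} depends only on the past

print(ind_sum / p_sum)   # should be close to 1
```

The point of the feedback rule `p_next = 0.8 if hit else 0.2` is just to make the $p_n$ genuinely random and adapted rather than deterministic, while keeping them bounded away from $0$.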
6. This sounds pretty much like saying: if $\sum_{m=1}^{\infty} p_m = \infty$ a.s., then $\sum_{m=1}^\infty 1_{B_m} = \infty$ a.s. I didn't learn this in probability class, but maybe it can be understood through examples of $\{B_n\}$ and $\{\mathcal F_n\}$ such that
6.1. $\sum_{m=1}^{\infty} p_m$ is not almost surely infinite, but still $\sum_{m=1}^\infty 1_{B_m} = \infty$ a.s. (though if 4.3.4 really gives $\{\sum_{m=1}^{\infty} 1_{B_m} = \infty\} = \{\sum_{m=1}^{\infty} p_m = \infty\}$ up to a null set, then no such example can exist);
6.2. $\sum_{m=1}^{\infty} p_m$ is not almost surely infinite, and $\sum_{m=1}^\infty 1_{B_m}$ is not almost surely infinite either.
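For 6.2 I believe a concrete example works (my own choice, assuming independence is allowed, so that $p_n = E(1_{B_n} \mid \mathcal F_{n-1}) = P(B_n)$): take independent $B_n$ with $P(B_n) = 2^{-n}$. Then $\sum_{m=1}^{\infty} p_m = 1$ on every path, and by the first Borel-Cantelli lemma $\sum_{m=1}^\infty 1_{B_m}$ is almost surely finite. A quick simulation:

```python
import random

random.seed(1)

# 6.2-style toy example, assuming independent B_n with P(B_n) = 2^{-n}:
# then p_n = 2^{-n} is constant, so sum_{n<=depth} p_n = 1 - 2^{-depth}
# on every path, and Borel-Cantelli I says a.s. only finitely many occur.
paths, depth = 2_000, 30
totals = []                      # total number of B_n occurring per path
for _ in range(paths):
    total = sum(random.random() < 2.0 ** -n for n in range(1, depth + 1))
    totals.append(total)

avg = sum(totals) / paths
print(avg, max(totals))   # average near 1; every observed total is small
```

Here neither sum is almost surely infinite: the $p$-sum is bounded by 1 surely, and the indicator sum is finite off a null set, matching 6.2.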