I want to show that the optional sampling theorem does not hold for unbounded stopping times, using the St. Petersburg paradox/St. Petersburg game as an example. We toss a fair coin repeatedly and play until we win for the first time. In round one you bet $1$ unit of money: you lose it if you lose the round, and you gain $1$ unit if you win the round. In every subsequent round the stake is doubled.
I want to consider the total win/loss process, which will be a martingale. Let $(\xi_k)_{k\in\mathbb{N}}$ be an i.i.d. sequence of random variables with $\xi_k\in\{-1,1\}$ and $\mathbb{P}(\xi_k=1)=\mathbb{P}(\xi_k=-1)=\tfrac{1}{2}$, corresponding to winning/losing round $k$. The bet amounts are $(b_k)_{k\in\mathbb{N}}$ with $b_1=1$ and $b_k=2^{k-1}$. Then the total process is $X_k=\sum_{j=1}^k b_j \xi_j$. One can easily check that this is a martingale (with respect to $\mathcal{F}_k^X=\sigma(\{X_1,\dots,X_k\})$).
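As a quick sanity check (not part of the argument), here is a minimal Python sketch of the process: it simulates paths of $X_k$ with stakes $b_k=2^{k-1}$ and fair signs $\xi_k$, and checks that the empirical mean of $X_3$ is close to $0$, as the martingale property (started from $X_0=0$) predicts. The helper name `simulate_X` is mine, not from the question.

```python
import random

def simulate_X(n_rounds, rng=random):
    """One path of X_k = sum_{j<=k} b_j * xi_j with b_j = 2^(j-1), evaluated at k = n_rounds."""
    total = 0
    for k in range(1, n_rounds + 1):
        xi = rng.choice([-1, 1])    # fair coin: xi_k = +1 (win) or -1 (loss)
        total += 2 ** (k - 1) * xi  # stake b_k = 2^(k-1)
    return total

random.seed(0)
n_sims = 100_000
# (X_k) is a martingale started at 0, so E(X_3) = 0; the sample mean should be near 0
mean_X3 = sum(simulate_X(3) for _ in range(n_sims)) / n_sims
print(mean_X3)
```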
The optional sampling theorem says that for two bounded $\mathcal{F}^X$-stopping times $\sigma\le\tau$ one has $\mathbb{E}(X_{\tau}|\mathcal{F}_{\sigma}^X)=X_{\sigma}$.
Now I want to use the time of the first win $\tau=\inf\{k\in\mathbb{N}|\xi_k=1\}$ (which is an unbounded $\mathcal{F}^X$-stopping time) to show that the optional sampling theorem does not hold.
The problem is that I don't know how to choose my second stopping time. While $\sigma=\tau-1$ would work nicely, it is not allowed because it is not an $\mathcal{F}^X$-stopping time. I have tried numerous choices for $\sigma$, but none of them works.
The general idea however is to calculate $$\mathbb{E}(X_{\tau}|\mathcal{F}_{\sigma}^X)=\sum_{k\in\mathbb{N}} \mathbb{P}(\tau=k)\mathbb{E}(X_{k}|\mathcal{F}_{\sigma}^X)= \sum_{k\in\mathbb{N}} 2^{-k}\mathbb{E}(X_{k}|\mathcal{F}_{\sigma}^X)$$ and then to use the fact that $(X_k)_{k\in\mathbb{N}}$ is a martingale, so $\mathbb{E}(X_{k}|\mathcal{F}_{s}^X)=X_s$ if $s\le k$. After that I would plug in the definition of $X_k$ and use the fact that $\xi_j=-1$ for all $1\le j\le k-1$, since we assumed $\tau=k$ in that case.
If $\tau(\omega)=k$, then you have lost the first $(k-1)$ rounds and you have won the $k$-th round, i.e.
$$X_{\tau}(\omega)= -\sum_{j=1}^{k-1} 2^{j-1} + 2^{k-1} = (1-2^{k-1}) + 2^{k-1}=1 $$
for any such $\omega$. Since $\tau<\infty$ almost surely and this holds for every $k \geq 1$, it follows that $X_{\tau}=1$ almost surely. In particular, $\mathbb{E}(X_{\tau})=1$.
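The identity $X_\tau = 1$ is exact on every path, not just on average, so it is easy to confirm by simulation. The sketch below (helper name `play_until_first_win` is mine) plays the doubling strategy until the first win and checks that the terminal value is $1$ in every simulated game.

```python
import random

def play_until_first_win(rng=random):
    """Play with doubling stakes b_k = 2^(k-1) until the first win; return (tau, X_tau)."""
    total, k = 0, 0
    while True:
        k += 1
        xi = rng.choice([-1, 1])
        total += 2 ** (k - 1) * xi
        if xi == 1:          # first win: stopping time tau = k
            return k, total

random.seed(1)
results = [play_until_first_win() for _ in range(10_000)]
# X_tau = -(2^(k-1) - 1) + 2^(k-1) = 1 on every path
assert all(x == 1 for _, x in results)
```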
Now set $\sigma:=1$; then $\sigma$ is a (bounded) stopping time satisfying $\sigma \leq \tau$, and $\mathbb{E}(X_{\sigma})=\mathbb{E}(X_1)=0$. In particular, $$\mathbb{E}(X_{\tau}) \neq \mathbb{E}(X_{\sigma}),$$ which implies $$\mathbb{E}(X_{\tau} \mid \mathcal{F}_{\sigma}) \neq X_{\sigma},$$ since equality would yield $\mathbb{E}(X_{\tau})=\mathbb{E}(X_{\sigma})$ after taking expectations.
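The mismatch of the two expectations can also be seen numerically. The sketch below (the helper name `sampled_pair` is mine) records $X_\sigma = X_1$ and $X_\tau$ on each simulated path: the sample mean of $X_\sigma$ is close to $0$, while the sample mean of $X_\tau$ is exactly $1$.

```python
import random

def sampled_pair(rng=random):
    """Return (X_sigma, X_tau) for sigma = 1 and tau = time of the first win."""
    total, k = 0, 0
    x_sigma = 0
    while True:
        k += 1
        xi = rng.choice([-1, 1])
        total += 2 ** (k - 1) * xi
        if k == 1:
            x_sigma = total        # X_sigma = X_1 = +1 or -1
        if xi == 1:
            return x_sigma, total  # stopped at tau, so total = X_tau

random.seed(2)
n = 100_000
pairs = [sampled_pair() for _ in range(n)]
mean_sigma = sum(a for a, _ in pairs) / n  # ~ 0 = E(X_sigma)
mean_tau = sum(b for _, b in pairs) / n    # exactly 1, since X_tau = 1 on every path
```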