You are playing a fair betting game: every round you win 1 dollar with probability 0.5 or lose 1 dollar with probability 0.5, and you play a total of $T$ rounds per game. Suppose that, in hindsight, you always left the game at the moment your earnings were at their maximum for that game. What are your expected earnings?
The answer is $O(\sqrt{T})$, but I can't figure out why. Any suggestions?
This is ultimately a question about random walks, and the problem is closely related to Wald's second identity, which states that if $\{X_i\}$ are i.i.d. with $\mathbb{E} X_i = 0$ and $\text{Var}(X_i) = 1$, then for any stopping time $T$, $$ \mathbb{E} (S_T^2) = \mathbb{E}(T),$$ where $S_k = \sum_{i = 1}^k X_i$. In your case, the $X_i$'s are Rademacher random variables, i.e. $\mathbb{P}(X_i = 1) = \mathbb{P}(X_i = -1) = \frac{1}{2}$, which indeed have mean $0$ and variance $1$.
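If it helps to see the $\sqrt{T}$ scaling empirically, here is a small Monte Carlo sketch (the function name `expected_best_earnings` and the parameter choices are my own, not from the question). It estimates $\mathbb{E}\big[\max_{0 \le k \le T} S_k\big]$ for the $\pm 1$ walk, which for large $T$ should grow on the order of $\sqrt{T}$ (the asymptotic constant is $\sqrt{2/\pi} \approx 0.8$):

```python
import math
import random

def expected_best_earnings(T, trials=20000, seed=0):
    """Monte Carlo estimate of E[max_{0 <= k <= T} S_k] for a +/-1 random walk."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s = 0
        best = 0  # leaving before the first round is allowed, so the max is at least 0
        for _ in range(T):
            s += 1 if rng.random() < 0.5 else -1
            if s > best:
                best = s
        total += best
    return total / trials

# For large T the estimate should track sqrt(2*T/pi), i.e. roughly 0.8 * sqrt(T).
est = expected_best_earnings(400)
print(f"estimate: {est:.2f}, sqrt(2T/pi): {math.sqrt(2 * 400 / math.pi):.2f}")
```

Running this with $T = 400$ gives an estimate close to $\sqrt{2 \cdot 400/\pi} \approx 16$, consistent with the $O(\sqrt{T})$ claim.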