I am beginning to teach myself stochastic calculus from here: https://services.math.duke.edu/~agazzi/notes_MAR31st.pdf
TLDR: Why, in the final equality, are the $1$ and $3$ squared?
I found the following example from some lecture notes:
Consider a random sequence $\omega = \{\omega_{i}\}_{i=1}^{N}$ where each $\omega_{i}$ equals $1$ with probability $\frac{1}{2}$ and $-1$ with probability $\frac{1}{2}$, independently.
Let $\Omega=\{-1,1\}^{N}$, i.e. the set of all sequences of length $N$ with entries $1$ and $-1$.
Consider the sequence of functions $X_{n}: \Omega \rightarrow \mathbb{Z}$ defined by $X_{0}(\omega) = 0$ and $X_{n}(\omega) = \sum_{i=1}^{n}\omega_{i}$.
Calculate $$\begin{align*} \mathbb{E}[(X_{3})^{2}|X_{2}=2] &= \sum_{i \in \mathbb{N}}i \mathbb{P}[(X_{3})^{2} = i|X_{2}=2] \\ &= (1)^{2}\mathbb{P}[X_{3}=1|X_{2}=2] + (3)^{2}\mathbb{P}[X_{3}=3|X_{2}=2]\\ &=5 \end{align*}$$
I managed to get the same answer as the notes, but I do not understand why there is a squared weight on each term in the final line of the calculation.
It follows from the law of the unconscious statistician: instead of summing over the values $i$ taken by $(X_{3})^{2}$, you can sum over the values $i$ taken by $X_{3}$ itself, weighting each probability by $i^2$:
$$\mathbb{E}[(X_{3})^{2}|X_{2}=2] = \sum_{i \in \mathbb{N}}\color{red}{i^2} \mathbb{P}[X_{3} = i|X_{2}=2]$$
Given $X_{2}=2$, the variable $X_{3}$ takes the values $1$ and $3$ each with probability $\frac{1}{2}$, so this sum is $1^{2}\cdot\frac{1}{2} + 3^{2}\cdot\frac{1}{2} = 5$.
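You can also sanity-check the value $5$ by brute force. A minimal Python sketch (the helper `X` below is my own naming, not from the notes): enumerate all of $\{-1,1\}^{3}$, keep the sequences with $X_{2}=2$, and average $(X_{3})^{2}$ over them.

```python
from itertools import product

def X(n, omega):
    """Partial sum X_n(omega) = omega_1 + ... + omega_n."""
    return sum(omega[:n])

# All equally likely sequences omega in {-1, 1}^3; three steps
# are enough to determine both X_2 and X_3.
omegas = list(product([-1, 1], repeat=3))

# Condition on X_2 = 2: keep only the matching sequences, then
# average (X_3)^2 over them (each survivor is equally likely).
conditioned = [omega for omega in omegas if X(2, omega) == 2]
expectation = sum(X(3, omega) ** 2 for omega in conditioned) / len(conditioned)
print(expectation)  # → 5.0
```

Only the sequences starting $(1, 1, \dots)$ satisfy $X_{2}=2$, so the average is $(1^{2} + 3^{2})/2 = 5$, matching the notes.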