Learning Conditional expectation: Understanding a simple example


I am beginning to try and teach myself stochastic calculus from here: https://services.math.duke.edu/~agazzi/notes_MAR31st.pdf

TL;DR: In the final equality, why are the $1$ and $3$ squared?

I found the following example from some lecture notes:

Consider a random sequence $\omega = \{\omega_{i}\}_{i=1}^{N}$ where the $\omega_{i}$ are independent with $\mathbb{P}(\omega_{i} = 1) = \mathbb{P}(\omega_{i} = -1) = \frac{1}{2}$.

Let $\Omega=\{-1,1\}^{N}$, i.e. the set of all sequences of length $N$ made from $1$ and $-1$.

Consider the sequence of functions $X_{n}: \Omega \rightarrow \mathbb{Z}$ with $X_{0}(\omega) = 0$ and $X_{n}(\omega) = \sum_{i=1}^{n}\omega_{i}$.

Calculate $$\begin{align*} \mathbb{E}[(X_{3})^{2}|X_{2}=2] &= \sum_{i \in \mathbb{N}}i \mathbb{P}[(X_{3})^{2} = i|X_{2}=2] \\ &= (1)^{2}\mathbb{P}[X_{3}=1|X_{2}=2] + (3)^{2}\mathbb{P}[X_{3}=3|X_{2}=2]\\ &=5 \end{align*}$$
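Not part of the notes, but since $\Omega$ is finite one can sanity-check this number by brute force: enumerate all equally likely sequences in $\{-1,1\}^{3}$, keep those consistent with the conditioning event $X_{2}=2$, and average $(X_{3})^{2}$ over them. A minimal sketch:

```python
from itertools import product

# All equally likely sequences omega in {-1, 1}^3.
# X_n(omega) = omega_1 + ... + omega_n (partial sums of the walk).
outcomes = list(product([-1, 1], repeat=3))

# Keep only sequences consistent with the conditioning event X_2 = 2.
conditioned = [w for w in outcomes if w[0] + w[1] == 2]

# E[(X_3)^2 | X_2 = 2]: average (X_3)^2 over the conditioned outcomes.
# A plain average is valid because every sequence has probability (1/2)^3.
x3_squared = [(w[0] + w[1] + w[2]) ** 2 for w in conditioned]
result = sum(x3_squared) / len(x3_squared)
print(result)  # 5.0
```

Only $(1,1,-1)$ and $(1,1,1)$ survive the conditioning, giving $(1^{2} + 3^{2})/2 = 5$.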

I have managed to get the same answer as the notes, but I do not understand why there is a squared weight in front of each probability in the final line of the calculation.


Best answer:

It follows from the law of the unconscious statistician:

$$\mathbb{E}[(X_{3})^{2}|X_{2}=2] = \sum_{i \in \mathbb{N}}\color{red}{i^2} \mathbb{P}[X_{3} = i|X_{2}=2]$$
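To see numerically that the two ways of writing the sum agree (summing $j\,\mathbb{P}[(X_3)^2 = j]$ over values of $(X_3)^2$ versus summing $i^2\,\mathbb{P}[X_3 = i]$ over values of $X_3$), here is a small check, not from the notes, that computes both conditional sums by enumeration:

```python
from itertools import product
from collections import Counter

# All sequences in {-1, 1}^3 with X_2 = 2; each has the same
# conditional probability p given the event X_2 = 2.
paths = [w for w in product([-1, 1], repeat=3) if w[0] + w[1] == 2]
p = 1 / len(paths)

# Sum over values j of (X_3)^2:  sum_j j * P[(X_3)^2 = j | X_2 = 2]
dist_sq = Counter((w[0] + w[1] + w[2]) ** 2 for w in paths)
lhs = sum(j * count * p for j, count in dist_sq.items())

# LOTUS form, over values i of X_3:  sum_i i^2 * P[X_3 = i | X_2 = 2]
dist = Counter(w[0] + w[1] + w[2] for w in paths)
rhs = sum(i ** 2 * count * p for i, count in dist.items())

print(lhs, rhs)  # both 5.0
```

Both sums run over different index sets but collect exactly the same terms, which is the content of the law of the unconscious statistician.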

Second answer:

> I have managed to get the same answer as the notes, but I do not understand why there is a squared weight in front of each probability in the final line of the calculation.

The missing step is that the only values of $(X_3)^2$ with non-zero conditional probability are $1$ and $9$, so the corresponding values of $X_3$ are $1$ and $3$:

$$\begin{align}\mathbb{E}[(X_{3})^{2}|X_{2}=2] ~&=~ \sum_{i \in \mathbb{N}}i \mathbb{P}[(X_{3})^{2}= i\mid X_{2}=2]\\[1ex]&\color{blue}{=~ (1)^{2}\mathbb{P}[(X_{3})^2=(1)^2\mid X_{2}=2] + (3)^{2}\mathbb{P}[(X_{3})^2=(3)^{2}\mid X_{2}=2]}\\[1ex]&=~ (1)^{2}\mathbb{P}[X_{3}=1\mid X_{2}=2] + (3)^{2}\mathbb{P}[X_{3}=3\mid X_{2}=2]\\[1ex]&=5\end{align}$$