Conditional expectation problem 2


Let $X_{1},\ldots,X_{n}$ be i.i.d. random variables such that $E|X_{i}|< \infty$ for $i=1,\ldots,n$. Consider the sigma-algebra $\mathcal{G}=\sigma(X_{1}+\ldots+X_{n})$. I have to show that $E(X_{i}|\mathcal{G})=E(X_{1}|\mathcal{G})$ for $i=1,\ldots,n$. I suspect that I should use the definition of conditional expectation and take advantage of the sigma-algebra structure, but I cannot figure out the way. Any help would be appreciated. Thanks in advance.


Accepted answer:

By the definition of conditional expectation, $Y=E(X_{1}|\mathcal{G})$ if the following two conditions hold:

  1. $Y$ is $\mathcal{G}$-measurable,
  2. $E\left[Y\cdot\mathbf{1}_{G}\right]=E\left[X_1\cdot\mathbf{1}_{G}\right]$ for any $G\in \mathcal G$, where $\mathbf{1}_{G}$ is the indicator function of $G$.

Let us check whether $Y=E(X_{1}|\mathcal{G})$ coincides with $E(X_{i}|\mathcal{G})$. We need to verify the analogues of conditions 1 and 2:

  1. $Y$ is $\mathcal{G}$-measurable,
  2. $E\left[Y\cdot\mathbf{1}_{G}\right]=E\left[X_i\cdot\mathbf{1}_{G}\right]$ for any $G\in \mathcal G$.

The first condition is satisfied automatically. To check the second, it suffices to prove that $E\left[X_i\cdot\mathbf{1}_{G}\right]=E\left[X_1\cdot\mathbf{1}_{G}\right]$ for any $G\in \mathcal G$.

Recall that any set $G\in\mathcal G$ has the form $G=\{X_1+\ldots+X_n\in B\}$ for some Borel set $B$.

Since $X_1,\ldots,X_n$ are i.i.d., the distributions of the random vectors $$(X_1,\ldots,X_i,\ldots,X_n) \text{ and } (X_i,\ldots,X_1,\ldots,X_n)$$ (the second obtained from the first by swapping coordinates $1$ and $i$) are the same: both are products of the same marginal distributions. Consequently, any measurable function of the first vector has the same distribution as that function of the second vector. Since the sum $X_1+\ldots+X_n$ is unchanged by the swap, $$X_i\cdot\mathbf{1}_{\{X_1+\ldots+X_n\in B\}}\overset{d}{=}X_1\cdot\mathbf{1}_{\{X_1+\ldots+X_n\in B\}}$$ for any $B\in\mathfrak B(\mathbb R)$, and hence their expectations coincide: $$E\left[X_i\cdot\mathbf{1}_{G}\right]= E\left[X_1\cdot\mathbf{1}_{G}\right].$$
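Not part of the original answer, but the key identity $E\left[X_i\cdot\mathbf{1}_{G}\right]=E\left[X_1\cdot\mathbf{1}_{G}\right]$ can be sanity-checked by Monte Carlo. The distribution (exponential) and the Borel set $B=[n,\infty)$ below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 200_000

# i.i.d. samples: column j holds draws of X_{j+1} (exponential chosen arbitrarily)
X = rng.exponential(scale=1.0, size=(trials, n))
S = X.sum(axis=1)

# G = {X_1 + ... + X_n in B} with the arbitrary Borel set B = [n, infinity)
G = S >= n

# Monte Carlo estimates of E[X_i * 1_G] for i = 1, ..., n
estimates = (X * G[:, None]).mean(axis=0)
print(estimates)  # all n entries agree up to Monte Carlo error
```

All the estimates agree within sampling error, as the exchangeability argument predicts.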

We have thus checked that $Y=E(X_{1}|\mathcal{G})$ satisfies the definition of $E(X_{i}|\mathcal{G})$. Therefore $$Y=E(X_{1}|\mathcal{G})=E(X_{i}|\mathcal{G}) \text{ a.s.}$$

Another answer:

A physicist would use a symmetry argument: all the $X_i$ are i.i.d., and you are conditioning on a symmetric function of them, so how could you possibly tell any difference between the individual random variables? Not only the conditional expectations, but the conditional variances, etc., would be the same.
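The symmetry heuristic can also be illustrated numerically (this sketch is not from the original answer; the uniform distribution and the binning of the sum are arbitrary choices): binning the samples by the value of $X_1+\ldots+X_n$ and averaging $X_1$ and $X_2$ within each bin approximates their conditional expectations given the sum, and the two averages agree bin by bin.

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 4, 500_000
X = rng.uniform(size=(trials, n))  # i.i.d. uniforms (arbitrary choice)
S = X.sum(axis=1)

# Approximate E[X_1 | S] and E[X_2 | S] by averaging within bins of S
bins = np.linspace(0, n, 9)
idx = np.digitize(S, bins)
diffs = []
for k in np.unique(idx):
    m = idx == k
    if m.sum() > 1000:  # skip bins with too few samples for a stable average
        diffs.append(abs(X[m, 0].mean() - X[m, 1].mean()))
print(max(diffs))  # small: the conditional averages agree within each bin
```

The same comparison works for any pair $X_i$, $X_j$, in line with the symmetry argument.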