Expected value definitions


Let $X : \Omega \to \mathbb{R}$ be a discrete random variable on a discrete probability space with countable sample space $\Omega$. Let $P(\omega)$ be the probability of an outcome $\omega \in \Omega$, and $X(\omega)$ the real number assigned to the outcome $\omega$. There are two definitions for the expectation of $X$.

$$E(X) = \sum_{\omega \in \Omega} P(\omega)X(\omega)$$

and

$$E'(X) = \sum_{u \in \text{range}(X)} \sum_{X(\omega) = u} P(\omega)X(\omega)$$

The former is just an unordered sum over a countable set $\Omega$ and the latter is a double sum over countable subsets of $\Omega$. The expected value only exists if the unordered sum converges. How can we show that the two definitions are equivalent? That is, $E(X)$ exists if and only if $E'(X)$ exists (and both converge to the same value). I am using the following definition of unordered sum:

We say that $\sum_{\omega \in S} f(\omega) = T$ (converges to $T$) if for each $\varepsilon > 0$, there exists a finite subset $F \subseteq S$ such that for all finite $G$ satisfying $F \subseteq G \subseteq S$ we have $$\left|\sum_{\omega \in G} f(\omega) - T\right| < \varepsilon.$$

This question shows an attempt to prove a more general result about splitting an unordered sum. We can prove that $E(X)$ exists implies $E'(X)$ exists. But for the other direction, a nice counterexample was given. Thus, is there something special about the expectation function that makes the definitions equivalent?
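(For concreteness, here is a small numerical sketch comparing the two definitions on a truncated toy space; the distribution and the choice of $X$ are assumptions for illustration, and on a finite space both sums trivially exist.)

```python
from collections import defaultdict
from math import isclose

# Assumed toy space: Omega = {1, ..., N}, P(w) proportional to 2^-w,
# X(w) = (-1)^w * (w % 3). None of this is from the question itself.
N = 50
omega = list(range(1, N + 1))
Z = sum(2.0 ** -w for w in omega)          # normalizing constant
P = {w: 2.0 ** -w / Z for w in omega}
X = {w: (-1) ** w * (w % 3) for w in omega}

# E(X): a single sum over Omega (for an unordered/absolutely convergent
# sum, any enumeration order gives the same value).
E = sum(P[w] * X[w] for w in omega)

# E'(X): group outcomes by the value u = X(w), sum within each group,
# then sum the group totals over range(X).
groups = defaultdict(list)
for w in omega:
    groups[X[w]].append(w)
E_prime = sum(sum(P[w] * X[w] for w in ws) for ws in groups.values())

assert isclose(E, E_prime)
```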

Accepted answer:

The point is that all summands in each inner sum of $E'$ have the same sign.

Suppose $E'$ exists. Given $\varepsilon>0$, find a finite subset of the range that takes you within $\varepsilon/4$ of the limit, then within each of the ($n$, say) chosen summands pick a finite subset of outcomes that takes you within $\varepsilon/4n$ of the limit of that summand. The union is a finite subset of $\Omega$ that takes us within $\varepsilon/2$ of the limit. Adding more summands from "within" (the picked finite subset of the range) cannot take us further away, because within a group all summands share a sign, so the partial sums move monotonically toward the group's limit; adding summands from "without" costs at most another $\varepsilon/4$. All in all we stay within $\varepsilon$ of the limit.
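One way to make the bookkeeping explicit (the notation $S_u$, $T_u$, $F$, $G_u$ is introduced here for convenience): write $S_u := \sum_{X(\omega)=u} P(\omega)X(\omega)$ and, for a finite $G \subseteq \Omega$, $T_u := \sum_{\omega\in G,\ X(\omega)=u} P(\omega)X(\omega)$. Choose a finite $F \subseteq \text{range}(X)$ with $\sum_{u\notin F}|S_u| < \varepsilon/4$ (unordered convergence of $\sum_u S_u$ forces absolute convergence), and for each of the $n$ values $u \in F$ a finite $G_u \subseteq X^{-1}(\{u\})$ with $\left|\sum_{\omega\in G_u} P(\omega)X(\omega) - S_u\right| < \varepsilon/4n$. For any finite $G \supseteq \bigcup_{u\in F} G_u$: when $u \in F$, the same-sign summands make $T_u$ move monotonically toward $S_u$, so $|T_u - S_u| < \varepsilon/4n$ still holds; when $u \notin F$, $T_u$ lies between $0$ and $S_u$, so $|T_u - S_u| \le |S_u|$. Hence

$$\left|\sum_{\omega\in G} P(\omega)X(\omega) - E'\right| \le \sum_{u\in F}|T_u - S_u| + \sum_{u\notin F}|S_u| < n\cdot\frac{\varepsilon}{4n} + \frac{\varepsilon}{4} = \frac{\varepsilon}{2} < \varepsilon.$$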

Another answer:

$$\mathcal{P}:=\left\{ X^{-1}\left(\left\{ u\right\} \right)\mid u\in\text{range}X\right\}$$ is a partition of $\Omega$ so that: $$\sum_{\omega\in\Omega}P\left(\omega\right)X\left(\omega\right)=\sum_{u\in\text{range}X}\sum_{\omega\in X^{-1}\left(\left\{ u\right\} \right)}P\left(\omega\right)X\left(\omega\right)=\sum_{u\in\text{range}X}\sum_{X\left(\omega\right)=u}P\left(\omega\right)X\left(\omega\right)$$

Also note that:

$$\sum_{u\in\text{range}X}\sum_{X\left(\omega\right)=u}P\left(\omega\right)X\left(\omega\right)=\sum_{u\in\text{range}X}\sum_{X\left(\omega\right)=u}P\left(\omega\right)u=$$$$\sum_{u\in\text{range}X}u\sum_{X\left(\omega\right)=u}P\left(\omega\right)=\sum_{u\in\text{range}X}uP\left\{ X=u\right\}$$
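As a quick numerical sanity check of the last identity (with an assumed toy distribution, not one from the question):

```python
from math import isclose

# Assumed toy example: Omega = {1,...,6}, uniform P, X(w) = w % 3,
# so range(X) = {0, 1, 2}.
omega = range(1, 7)
P = {w: 1 / 6 for w in omega}
X = {w: w % 3 for w in omega}

# Left-hand side: sum over all outcomes.
lhs = sum(P[w] * X[w] for w in omega)

# Right-hand side: sum over values u, weighting u by P{X = u}.
rhs = sum(u * sum(P[w] for w in omega if X[w] == u)
          for u in set(X.values()))

assert isclose(lhs, rhs)   # both equal 1 here, up to rounding
```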


edit:

Setting $\Omega_{+}=\left\{ \omega\mid X\left(\omega\right)>0\right\} $ and $\Omega_{-}=\left\{ \omega\mid X\left(\omega\right)<0\right\} $, the sum $\sum_{\omega\in\Omega}P\left(\omega\right)X\left(\omega\right)$ is well defined as an expectation if and only if $\sum_{\omega\in\Omega_{+}}P\left(\omega\right)X\left(\omega\right)<\infty$ or $\sum_{\omega\in\Omega_{-}}P\left(\omega\right)X\left(\omega\right)>-\infty$.

Note that the sets $\Omega_{+}$ and $\Omega_{-}$ can both be written as a union of elements of $\mathcal{P}$.

So the condition can be translated into $\sum_{u\in R_{+}}\sum_{\omega\in X^{-1}\left(\left\{ u\right\} \right)}P\left(\omega\right)X\left(\omega\right)<\infty$ or $\sum_{u\in R_{-}}\sum_{\omega\in X^{-1}\left(\left\{ u\right\} \right)}P\left(\omega\right)X\left(\omega\right)>-\infty$, where $R_{+}=\left(0,\infty\right)\cap\text{range}X$ and $R_{-}=\left(-\infty,0\right)\cap\text{range}X$.

These are the conditions for the sum $\sum_{u\in\text{range}X}\sum_{\omega\in X^{-1}\left(\left\{ u\right\} \right)}P\left(\omega\right)X\left(\omega\right)$ to be well defined as expectation of $X$.
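As a finite sketch of this split (the distribution and $X$ are assumptions for illustration; in a genuinely infinite example one would check the two one-sided sums for finiteness rather than compute them exactly):

```python
from math import isclose

# Assumed toy data: P(w) = 2^-w / Z on Omega = {1,...,N},
# X(w) = (-1)^w * w, so X is positive on even w and negative on odd w.
N = 50
omega = list(range(1, N + 1))
Z = sum(2.0 ** -w for w in omega)
P = {w: 2.0 ** -w / Z for w in omega}
X = {w: (-1) ** w * w for w in omega}

pos = sum(P[w] * X[w] for w in omega if X[w] > 0)   # sum over Omega_+
neg = sum(P[w] * X[w] for w in omega if X[w] < 0)   # sum over Omega_-

# Here pos < infinity and neg > -infinity, so E(X) is well defined
# and equals the sum of the two one-sided parts.
E = sum(P[w] * X[w] for w in omega)
assert isclose(E, pos + neg)
```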