The Expected Value of $g(X)$


I am studying "A First Course in Probability" by Sheldon Ross, and I have come across a problem with the following proof:

Proposition 4.1

If $X$ is a discrete random variable that takes on one of the values $x_i$, $i \geq 1$, with respective probabilities $p(x_i)$, then for any real-valued function $g$, $$E(g(X)) = \sum_i{g(x_i)p(x_i)}$$

Proof.

$$\sum_i{g(x_i)p(x_i)}= \sum_j{\sum_{i:g(x_i)=y_j}{g(x_i)p(x_i)}}=\sum_j{\sum_{i:g(x_i)=y_j}{y_jp(x_i)}}=\sum_j{y_j}\sum_{i:g(x_i)=y_j}{p(x_i)}=\sum_j{y_jP(g(X)=y_j)}=E[g(X)]$$

But we know that we cannot rearrange the terms of a conditionally convergent series without possibly changing its sum (the Riemann rearrangement theorem). I could not prove that the series $\sum_i{g(x_i)p(x_i)}$ is absolutely convergent. So, what allows us to regroup the terms of the series in the first equality of the proof?
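The danger the question raises can be seen numerically (this illustration is not part of Ross's text): the alternating harmonic series is conditionally convergent and sums to $\ln 2$, but the rearrangement that takes one positive term followed by two negative terms converges to $\tfrac{1}{2}\ln 2$ instead.

```python
import math

# Alternating harmonic series: 1 - 1/2 + 1/3 - 1/4 + ... = ln 2
N = 200_000
s = sum((-1) ** (n + 1) / n for n in range(1, N + 1))

# Rearrangement: one positive term, then two negative terms.
# Group k contributes +1/(2k-1) - 1/(4k-2) - 1/(4k).
# The rearranged series converges to (1/2) ln 2, a different sum.
r = 0.0
for k in range(1, N + 1):
    r += 1 / (2 * k - 1)
    r -= 1 / (4 * k - 2)
    r -= 1 / (4 * k)

print(s, math.log(2))      # close to ln 2  ~ 0.6931
print(r, math.log(2) / 2)  # close to ln 2 / 2 ~ 0.3466
```

So without absolute convergence, the regrouping in the proof's first step genuinely needs justification.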

On BEST ANSWER

Correct: in order to appeal to Tonelli's or Fubini's theorem to conclude $\sum_{j = 1}^{\infty}\sum_{k = 1}^{\infty}a_{jk} = \sum_{k = 1}^{\infty}\sum_{j = 1}^{\infty}a_{jk}$, you need either that $a_{jk} \in [0, \infty]$ for all $j, k$ (Tonelli) or that $\sum_{j = 1}^{\infty}\sum_{k = 1}^{\infty}|a_{jk}|< \infty$ (Fubini). Hence the proof works as written for nonnegative $g$. For general $g$, applying the nonnegative case to $|g|$ shows that the sum is absolutely convergent if and only if $E(|g(X)|) < \infty$, and the theorem holds for such $g$ too.
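When the support of $X$ is finite, all the sums are trivially absolutely convergent, and the two sides of Proposition 4.1 can be checked directly. A small sketch (an illustration with made-up numbers, not from Ross) computes $\sum_i g(x_i)p(x_i)$ term by term and then regroups the terms by the values $y_j$ of $g(X)$, exactly as in the proof:

```python
from collections import defaultdict

# A finite discrete X: support and probabilities (sums to 1).
xs = [-2, -1, 0, 1, 2]
ps = [0.1, 0.2, 0.4, 0.2, 0.1]
g = lambda x: x * x

# Left-hand side: sum_i g(x_i) p(x_i), summed in the given order.
lhs = sum(g(x) * p for x, p in zip(xs, ps))

# Right-hand side: group terms by y_j = g(x_i), i.e.
# sum_j y_j * P(g(X) = y_j) -- the regrouping used in the proof.
pmf_gX = defaultdict(float)
for x, p in zip(xs, ps):
    pmf_gX[g(x)] += p
rhs = sum(y * q for y, q in pmf_gX.items())

print(lhs, rhs)  # both equal E[g(X)] = 1.2
```

With infinite support the same regrouping is valid precisely under the conditions above: $g \geq 0$, or $E(|g(X)|) < \infty$.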