Is the expected value of the difference of these two random variables, with infinite expected value, $0$, or undefined?


Let's say we have two independent random variables, $x_1$ and $x_2$, each with the probability mass function $X$ defined as

$$X(n) = \begin{cases} 2^{-m} & \text{if $n=2^m$ for some integer $m \ge 1$} \\ 0 & \text{otherwise} \end{cases}$$

So $X(2)=\frac 12$, $X(4) = \frac 14$, $X(8) = \frac 18$, and so on (this is the distribution used in the St. Petersburg paradox).
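For intuition, here is a minimal Python sketch (my addition, not part of the original question) that samples this distribution by flipping a fair coin until the first head, and prints running sample means; since the expectation diverges, the sample mean tends to drift upward as the sample grows:

```python
import random

def sample_st_petersburg(rng=random):
    """Draw X with P(X = 2^m) = 2^(-m) for m >= 1:
    flip a fair coin until the first head; m = number of flips."""
    m = 1
    while rng.random() < 0.5:  # tails: keep flipping
        m += 1
    return 2 ** m

random.seed(0)
for n in (10**3, 10**4, 10**5):
    mean = sum(sample_st_petersburg() for _ in range(n)) / n
    print(f"n={n}: sample mean = {mean:.1f}")
```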

The expected value is infinite for both $x_1$ and $x_2$ (so it does not exist as a real number). My question is: does $E[x_1-x_2]$ exist? I know that if it exists, it must be $0$ by symmetry. I have a feeling it does equal $0$, but I'm not sure.

In general, if two random variables have the same probability distribution, when is the expected value of their difference $0$? (I suspect there are counterexamples.)


In probability theory, the expected value is defined via Lebesgue integration. In this sense your expected value is not defined, even as $\pm \infty$, because it is an $\infty - \infty$ form. That is, Lebesgue integration defines $\int f = \int f^+ - \int f^-$, where $f^+$ is the positive part of $f$ and $f^-$ is the negative part. Here both integrals are $\infty$, so you get $\infty - \infty$, which we decline to define.
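To see concretely that both parts are infinite, here is a small numerical sketch (my addition, not the answerer's) that truncates the double sum for the positive part $E[(x_1-x_2)^+] = \sum_{i,j \ge 1} \max(2^i - 2^j, 0)\, 2^{-i-j}$ at exponent $N$ and watches it grow; by symmetry the negative part behaves identically:

```python
def positive_part_partial(N):
    """Partial sum of E[(x1 - x2)^+]:
    sum over exponents 1 <= i, j <= N of max(2^i - 2^j, 0) * 2^(-i-j)."""
    return sum(max(2**i - 2**j, 0) * 2.0 ** (-i - j)
               for i in range(1, N + 1)
               for j in range(1, N + 1))

for N in (10, 20, 40):
    print(f"N={N}: truncated E[(x1-x2)^+] = {positive_part_partial(N):.2f}")
```

The truncated sum grows roughly linearly in $N$ (each row $i$ contributes $1 - 2^{1-i} - (i-1)2^{-i}$, which tends to $1$), so $\int f^+ = \int f^- = \infty$.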

You could define the expectation through a notion like the Cauchy principal value; in that case the "symmetry argument" you mentioned would indeed give zero. This is often a bad idea, though: if you take the limit with the symmetry broken (say, summing from $-n$ to $2n$ and then sending $n \to \infty$), you get a different limit.
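A numerical sketch (my addition, not from the original answer) makes the broken-symmetry point concrete for this distribution: restricting $E[x_1 - x_2]$ to a symmetric window $[-2^n, 2^n]$ gives exactly $0$, while the lopsided window $[-2^n, 2^{2n}]$ (the analogue, in exponent scale, of summing from $-n$ to $2n$) grows without bound as $n \to \infty$:

```python
def windowed_mean(lo, hi, max_exp):
    """Exact E[(x1 - x2) * 1{lo <= x1 - x2 <= hi}],
    summing over exponents 1 <= i, j <= max_exp."""
    total = 0.0
    for i in range(1, max_exp + 1):
        for j in range(1, max_exp + 1):
            d = 2**i - 2**j
            if lo <= d <= hi:
                total += d * 2.0 ** (-i - j)
    return total

for n in (5, 10, 20):
    # max_exp = 2n + 2 captures every pair (i, j) inside either window
    sym = windowed_mean(-2**n, 2**n, 2 * n + 2)           # symmetric window
    asym = windowed_mean(-2**n, 2**(2 * n), 2 * n + 2)    # broken symmetry
    print(f"n={n}: symmetric = {sym}, asymmetric = {asym:.2f}")
```

So the principal-value answer depends entirely on how the truncation windows are chosen, which is exactly why the expectation is left undefined.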