Two ways of calculating the expectation of a random variable.


I'm working with Loeve's book on Probability theory. At some point (probability distributions chapter) he proves that the expectation of a random variable $E[g(X)]$ can be calculated using two different methods: $E[g(X)] = \int_{R} g\,dP_X$ or $E[g(X)]=\int_{\Omega}g(X)\,dP$.

My question is very simple. Suppose that $P[X=1]=\lambda$ and $P[X=0]=1-\lambda$. What is the difference between the two methods in this case?

For the second one I know it would work as: $E[g(X)]=\int_{\Omega} g(X)\,dP=\int_{\Omega}X\,dP=\int_{[X=1]}1\,dP+\int_{[X=0]} 0\,dP = \lambda$. What about the first one? $E[g(X)]=\int_{R}x\,dP_X$ ... but I'm not sure how to continue.

Best answer:

You can see the difference clearly by writing out the probability space $(\Omega,\mathscr A, P)$ and taking a set $A\in\mathscr A$ together with its complement $A^c$. Let $P(A)=\lambda$, so that $P(A^c)=1-\lambda$.

Let then $(R,\mathscr L)$ be the measurable space formed by the real numbers and, say, the Lebesgue measurable sets. Also, let $X:\Omega\to R$ be a random variable, i.e. a measurable function, such that $X(\omega)=1$ if $\omega\in A$ and $X(\omega)=0$ otherwise. $X$ is measurable because

$$\{\omega: X(\omega)<x\}= \begin{cases} \emptyset\,\,\,\in \mathscr A,& \text{ if } x\le0\\ A^c\in \mathscr A,& \text{ if } 0< x\le1\\ \Omega\,\,\in \mathscr A,& \text{ if } 1< x. \end{cases}$$ Also

$$\{\omega: X(\omega)=1\}=A\,\text{ and }\, \{\omega: X(\omega)=0\}=A^c$$

or in other words

$$X^{-1}(\{1\})=A\,\text{ and }\,X^{-1}(\{0\})=A^c.$$ Accordingly, the distribution of $X$, usually denoted by $P_X$, is defined as follows for any $L\in\mathscr L$

$$P_X(L)= \begin{cases} 0,&\text{ if } 0\not \in L \text{ and } 1\not \in L\\ \lambda,&\text{ if } 0\not \in L \text{ and } 1 \in L\\ 1-\lambda,&\text{ if } 0 \in L \text{ and } 1 \not \in L\\ 1,&\text{ if } 0 \in L \text{ and } 1 \in L\\ \end{cases}$$ or, equivalently, in terms of $L\cap\{0,1\}$, $$P_X(L)= \begin{cases} 0,&\text{ if } L\cap\{0,1\}=\emptyset\\ \lambda,&\text{ if } L\cap\{0,1\}=\{1\}\\ 1-\lambda,&\text{ if } L\cap\{0,1\}=\{0\}\\ 1,&\text{ if } L\cap\{0,1\}=\{0,1\}.\\ \end{cases}$$

Now, $(R,\mathscr L, P_X)$ is a probability space.
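The fact that $P_X(L)$ depends only on $L\cap\{0,1\}$ can be sketched in a few lines of code. This is just an illustration, not from Loeve; the value $\lambda = 0.25$ is an arbitrary choice:

```python
# Sketch of the pushforward measure P_X for the Bernoulli variable above.
# P_X(L) depends only on whether 0 and 1 belong to L.
lam = 0.25  # hypothetical value of lambda (chosen exactly representable in binary)

def P_X(L):
    """Distribution of X evaluated on a set L (any container supporting `in`)."""
    return lam * (1 in L) + (1 - lam) * (0 in L)

# P_X is a probability measure on R concentrated on {0, 1}:
assert P_X(set()) == 0          # L misses the support entirely
assert P_X({1}) == lam          # only the value 1
assert P_X({0}) == 1 - lam      # only the value 0
assert P_X({0, 1}) == 1         # the whole support
assert P_X({-3, 1, 7}) == lam   # points outside {0, 1} contribute nothing
```

Any set $L$ containing neither $0$ nor $1$ gets measure zero, which is why the four cases above exhaust all possibilities.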

The expectation of $X$ can be calculated in two ways: as an integral over $(\Omega,\mathscr A, P)$ and as an integral over $(R,\mathscr L, P_X)$:

  1. $E[X]=\int_{\Omega}X(\omega)dP=1\cdot P(A)+0\cdot P(A^c)=\lambda.$

  2. $E[X]=\int_Rx\ dP_X=1\cdot P_X(\{1\})+0\cdot P_X(\{0\})=\lambda.$
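Since both measures here are concentrated on finitely many atoms, the two integrals reduce to finite sums, which a short sketch can confirm (the value of $\lambda$ is again a hypothetical choice):

```python
# The two computations of E[X] above, for a concrete lambda.
lam = 0.25  # hypothetical value of lambda

# 1. Integral over (Omega, A, P): sum X(omega) * P over the atoms A and A^c.
atoms = {"A": (1, lam), "A_complement": (0, 1 - lam)}  # atom -> (X value, P(atom))
E_omega = sum(x * p for x, p in atoms.values())

# 2. Integral over (R, L, P_X): sum x * P_X({x}) over the support {0, 1}.
P_X = {1: lam, 0: 1 - lam}
E_real = sum(x * p for x, p in P_X.items())

assert E_omega == E_real == lam
```

The first sum runs over points of $\Omega$ (grouped into the atoms $A$, $A^c$), the second over points of $R$; the change-of-variables theorem guarantees they agree.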

I used the function $g(x)=x$ above. Everything works the same way if $g$ is more complicated: the nature of $g$ is irrelevant (it has to be measurable, though) to the two different ways of calculating the expectation.

P.S.

$E[g(X)]=\int_{\Omega} g(X)\,dP\not =\int_{\Omega}X\,dP$ unless $g(x)=x$.
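A quick numerical sketch of the P.S.: with a nonlinear $g$ (here the hypothetical choice $g(x)=3x+2$) the two change-of-variable computations still agree with each other, but $E[g(X)]$ no longer equals $E[X]$:

```python
# E[g(X)] computed both ways, for g(x) = 3x + 2 and a concrete lambda.
lam = 0.25  # hypothetical value of lambda

def g(x):
    return 3 * x + 2  # an arbitrary measurable g with g(x) != x

# Over Omega: sum g(X(omega)) * P(atom); X = 1 on A, X = 0 on A^c.
E_omega = g(1) * lam + g(0) * (1 - lam)

# Over R: sum g(x) * P_X({x}) over the support {0, 1}.
E_real = sum(g(x) * p for x, p in {1: lam, 0: 1 - lam}.items())

assert E_omega == E_real == 3 * lam + 2  # equals 2.75 here
assert E_omega != lam                    # E[g(X)] differs from E[X] = lam
```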