Proving the shortcut for the expected value of a random variable


Suppose you have some random variable $Y=g(X)$ and you know the pdf of $X$.

In general the expected value of $Y$ is $$E[Y]=\int_{-\infty}^{\infty} y\, f_Y(y)\, dy.$$

However, since $Y$ is a function of $X$ and we know the pdf of $X$, we can say $$E[Y]=\int_{-\infty}^{\infty} g(x)\, f_X(x)\, dx.$$

Clearly these two methods should provide the same result, but can someone help me prove that the second formula is actually equal to $E[Y]$?

My apologies for the lack of effort on this post, I only have a few more hours until my exam and I am trying to make sense of this. All help is greatly appreciated!
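Before proving it, it can help to see the shortcut work numerically. Below is a minimal Monte Carlo sketch (my own example, not from the question): take $X \sim N(0,1)$ and $g(x)=x^2$, so $Y=X^2$ is chi-squared with one degree of freedom and $E[Y]=1$. The shortcut averages $g(X)$ over samples of $X$ without ever computing the density of $Y$.

```python
import random

# Assumed example: X ~ N(0,1), g(x) = x^2, so Y = X^2 is
# chi-squared with 1 degree of freedom and E[Y] = 1.
random.seed(0)
n = 200_000

# The "shortcut" (second formula): average g(X) directly over
# samples of X, never touching the density of Y.
e_y = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n)) / n

print(e_y)  # should be close to 1
```

The sample mean lands near 1, matching what the first formula would give if you integrated $y\,f_Y(y)$ against the chi-squared density directly.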

---

You cannot simply compare the density functions inside integrals taken over different spaces.

There are two ways to compute the expectation of $Y=g(X)$:

$$E(Y)=\int_{y \in R(Y)} y\, f_Y(y)\, dy$$

$$E(g(X))=\int_{x\in R(X)} g(x)\, f_X(x)\, dx$$
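As a quick numeric sanity check that these two integrals agree, here is a sketch with an assumed concrete example: $X \sim \mathrm{Uniform}(0,1)$ (so $f_X(x)=1$) and $g(x)=x^2$, for which the density of $Y=X^2$ is known to be $f_Y(y)=\frac{1}{2\sqrt{y}}$ on $(0,1)$ and both formulas should give $E[Y]=\frac{1}{3}$.

```python
import math

def riemann(f, a, b, n=100_000):
    """Midpoint Riemann sum of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Assumed example: X ~ Uniform(0,1), g(x) = x^2.
f_X = lambda x: 1.0                          # density of X
f_Y = lambda y: 1.0 / (2.0 * math.sqrt(y))   # known density of Y = X^2
g = lambda x: x ** 2

e_via_fy = riemann(lambda y: y * f_Y(y), 0.0, 1.0)    # first formula
e_via_fx = riemann(lambda x: g(x) * f_X(x), 0.0, 1.0) # second formula

print(e_via_fy, e_via_fx)  # both approach 1/3
```

Both integrals converge to the same value, even though one runs over $R(Y)$ and the other over $R(X)$.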

Notice that the integrations run over different spaces. To compare the density functions inside the integrals, you need to rewrite the second integral over the space $y\in R(Y)$ with respect to $dy$.

Notice that $R(Y) = \{y : y=g(x),\ x \in R(X)\}$, and assume $g$ is differentiable and strictly increasing (for a decreasing or non-monotone $g$ you would use $|g'(x)|$ and sum over branches), so that $\frac{dy}{dx}=g'(x)$ and $x=g^{-1}(y)$. Given the pdf of $X$, $f_X(x)$, we can rewrite the second integral as

$$E(g(X))=\int_{y \in R(Y)} g\!\left(g^{-1}(y)\right) f_X\!\left(g^{-1}(y)\right) \frac{1}{g'\!\left(g^{-1}(y)\right)}\, dy$$

Now we have two integrals over the same space, and you can compare the integrands. Since $g(g^{-1}(y)) = y$, matching the two gives the density $$ f_Y(y) = f_X\!\left(g^{-1}(y)\right) \frac{1}{g'\!\left(g^{-1}(y)\right)}, $$ so $E(g(X)) = \int_{y\in R(Y)} y\, f_Y(y)\, dy = E(Y)$.

Done.
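As a final check, the change-of-variables identity $f_Y(y) = f_X(g^{-1}(y))\,/\,g'(g^{-1}(y))$ can be verified pointwise. This sketch uses the same assumed example as above, $X \sim \mathrm{Uniform}(0,1)$ and $g(x)=x^2$, for which $g^{-1}(y)=\sqrt{y}$, $g'(x)=2x$, and the known density of $Y$ is $1/(2\sqrt{y})$.

```python
import math

# Assumed example: X ~ Uniform(0,1), g(x) = x^2 (strictly increasing on (0,1)).
f_X = lambda x: 1.0                 # density of X
g_inv = lambda y: math.sqrt(y)      # inverse of g
g_prime = lambda x: 2.0 * x         # derivative of g

for y in (0.1, 0.25, 0.5, 0.9):
    lhs = 1.0 / (2.0 * math.sqrt(y))          # known density of Y at y
    rhs = f_X(g_inv(y)) / g_prime(g_inv(y))   # change-of-variables formula
    print(y, lhs, rhs)  # lhs and rhs agree at every point
```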