Question about expectation notation


$E[X]=\sum_x x \, f_X(x)$ for the discrete case.

So let's say I want to find $E[XY]$.

By the definition, that would be $E[XY]=\sum_{(x,y)} xy \, f_{X,Y}(x,y)$, where $f_{X,Y}(x,y)$ is the joint pmf of $X$ and $Y$ (correct me if I'm wrong).

Now here comes the part that I don't understand: suppose I have $E[X^2Y]=\sum_{(x,y)} x^2 y \, f_{x^2y}(x,y)$ (by analogy with the definition), or some other "fancy" expectation. How do I get that probability distribution?

In the $E[X^2Y]$ case, is $f_{x^2y}(x,y)=f_{X,Y}(x,y)$, right? (if $X$ is always positive)

When can these simplifications occur?

Thank you very much!


We attempt an answer for the discrete case of the post.

It is probably not useful to define $E(XY)$ as you did. Note that $XY$ is a random variable; call it $W$. Then, by the definition of expectation, $$E(W)=\sum_w w\Pr(W=w),\tag{1}$$ where the sum is taken over all possible values of $W$. It is a theorem, not hard to prove, that $$E(W)=\sum_{x,y} xy \, f_{X,Y}(x,y),$$ where $f_{X,Y}(x,y)$ is the joint probability mass function of $X$ and $Y$.
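To see that the two sums agree, here is a minimal sketch on a hypothetical joint pmf (the support and probabilities are assumptions chosen only for illustration): we compute $E(W)$ once from definition (1), by first collecting $\Pr(W=w)$, and once from the joint sum.

```python
# Hypothetical joint pmf of (X, Y) on a small support (assumed for illustration).
f = {(1, 1): 0.2, (1, 2): 0.3, (2, 1): 0.1, (2, 2): 0.4}

# Definition (1): first build the pmf of W = XY, then sum w * Pr(W = w).
pmf_w = {}
for (x, y), p in f.items():
    w = x * y
    pmf_w[w] = pmf_w.get(w, 0.0) + p
e_w_def = sum(w * p for w, p in pmf_w.items())

# Theorem: E(W) = sum over (x, y) of x*y * f(x, y), no pmf of W needed.
e_w_thm = sum(x * y * p for (x, y), p in f.items())

print(e_w_def, e_w_thm)  # the two values coincide
```

The point of the theorem is the second computation: you never need to work out the distribution of $W$ itself.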

Similarly, $X^2Y$ is a random variable; call it $Z$. Then $E(Z)$ is defined in the same way as in (1), and it is a theorem that $$E(Z)=\sum_{x,y} x^2y \, f_{X,Y}(x,y).$$ More generally, for any function $g$, $$E(g(X,Y))=\sum_{x,y} g(x,y)\, f_{X,Y}(x,y).$$ (This general result is often called the law of the unconscious statistician.) So there is no separate distribution $f_{x^2y}$ to find: every such "fancy" expectation uses the same joint pmf $f_{X,Y}$, with only the function $g$ changing.
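The general formula can be sketched as a single helper that takes $g$ as an argument; the joint pmf below is the same hypothetical example as before, assumed only for illustration.

```python
# Hypothetical joint pmf of (X, Y), assumed for illustration.
f = {(1, 1): 0.2, (1, 2): 0.3, (2, 1): 0.1, (2, 2): 0.4}

def expect(g, joint):
    """Discrete case: E[g(X, Y)] = sum over (x, y) of g(x, y) * f(x, y)."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

e_xy  = expect(lambda x, y: x * y, f)       # E[XY]
e_x2y = expect(lambda x, y: x**2 * y, f)    # E[X^2 Y]: same joint pmf, different g
```

Note that both expectations are computed from the one joint pmf `f`; only the function passed in changes.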