Expected value of a complicated function of more than one random variable.


Assume we have random variables with probability density functions (pdfs) as follows: $$\omega_i \sim f_{1},\qquad i \in \{1,\dots,n\}$$ $$ \gamma= (\gamma_1,\cdots,\gamma_n) \sim f_2: \text{the joint pdf of the vector}$$

Now consider $T$, a function of these random variables listed above, as follows

$$T= \sum_{i=1}^n \gamma_i \left(1+ \sum_{j=1, \ j\neq i}^n f(\omega_j-\omega_i)\right)^2$$

where $f(\cdot)$ is a given function applied to the difference of the random variables.
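To make the indexing concrete, here is a minimal sketch of how one realisation of $T$ could be evaluated; the sample distributions and the choice $f(d)=e^{-d^2}$ are purely hypothetical:

```python
import numpy as np

def T(gamma, omega, f):
    """Evaluate T = sum_i gamma_i * (1 + sum_{j != i} f(omega_j - omega_i))**2
    for one realisation of the vectors gamma and omega."""
    n = len(gamma)
    total = 0.0
    for i in range(n):
        inner = sum(f(omega[j] - omega[i]) for j in range(n) if j != i)
        total += gamma[i] * (1.0 + inner) ** 2
    return total

# One draw, with hypothetical choices of the distributions and of f:
rng = np.random.default_rng(0)
omega = rng.normal(size=4)        # stand-in for samples from f_1
gamma = rng.exponential(size=4)   # stand-in for a draw from the joint pdf f_2
print(T(gamma, omega, lambda d: np.exp(-d**2)))
```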

Question:

What is the expected value of $\exp(T)$, i.e.

$$\mathbb{E} \bigl[\exp\left(T\right)\bigr]$$

My thoughts:

1. I think we first need to find the distribution of $\omega_j-\omega_i$ by convolution. Let us assume this is possible and that the pdf of the difference is $f_D(d)$.

2. Do I then need to find the joint distribution of the differences $\omega_j-\omega_i$?

3. Moreover, I think we have to use the following property, assuming $f_Y(y)$ is the pdf of $Y$:

$$\mathbb{E}_{X,Y}\big[ g(X, Y)\big]= \mathbb{E}_{X} \bigg[\mathbb{E}_{Y}\big[g(X,Y)\big]\bigg]=\mathbb{E}_{X}\bigg[ \int_{y}f_Y(y)\,g(X,y)\,dy\bigg], $$

which holds in this form when $X$ and $Y$ are independent (otherwise the inner density must be the conditional pdf $f_{Y\mid X}$).

It is the indices in the expression $T$ that are confusing me.
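The iterated-expectation property in point 3 can be sanity-checked numerically; a minimal sketch with independent $X \sim N(0,1)$, $Y \sim \mathrm{Exp}(1)$ and the hypothetical choice $g(x,y)=x^2 y$, for which analytically $\mathbb{E}[g(X,Y)]=\mathbb{E}[X^2]\,\mathbb{E}[Y]=1$:

```python
import numpy as np

# Monte Carlo check of E[g(X,Y)] = E_X[ E_Y[g(X,Y)] ] for independent
# X ~ N(0,1), Y ~ Exp(1), with the hypothetical g(x,y) = x**2 * y.
rng = np.random.default_rng(42)
x = rng.normal(size=200_000)
y = rng.exponential(size=200_000)

joint = np.mean(x**2 * y)        # E[g(X,Y)] estimated directly
inner = x**2 * np.mean(y)        # E_Y[g(x,Y)] for each sampled x
iterated = np.mean(inner)        # outer expectation E_X of the inner one

print(joint, iterated)           # both close to the analytic value 1
```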

Any advice would be much appreciated.

Thanks

Best answer:

There's a rule sometimes called the "law of the unconscious statistician": to get the expectation of a (measurable) function $f$ of a random variable $X$, you don't need to calculate the distribution of $f(X)$. Supposing for the moment that $X$ is real-valued with density $p$, you simply have

$$\mathbb{E}\bigl[f(X)\bigr] = \int_{-\infty}^\infty f(x)\, p(x)\, dx.$$

This is usually easier to calculate than finding the distribution of $f(X)$ itself. In your case, with the notation above, you want

$$\int_{\mathbb{R}^n} \int_{\mathbb{R}^n} \exp\bigl(T(\omega,\gamma)\bigr) \left(\prod_{i=1}^n f_1(\omega_i)\right) f_2(\gamma)\, d \omega\, d \gamma,$$

assuming the $\omega_i$ are mutually independent (each with marginal density $f_1$) and independent of $\gamma$, so that $\prod_{i} f_1(\omega_i)$ is the joint density of $\omega$.
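This double integral rarely has a closed form, so in practice it is often estimated by Monte Carlo. A minimal sketch, where the distributions ($\omega_i$ iid $N(0,1)$, $\gamma_i$ iid $\mathrm{Uniform}(0, 0.05)$ as a stand-in for $f_2$) and the choice $f(d)=0.1\,e^{-d^2}$ are purely illustrative, chosen small so that $\mathbb{E}[\exp T]$ is clearly finite:

```python
import numpy as np

# Monte Carlo sketch of E[exp(T)], under purely illustrative assumptions:
#   omega_i  iid N(0, 1), independent of gamma,
#   gamma_i  iid Uniform(0, 0.05)   (a stand-in for the joint density f_2),
#   f(d) = 0.1 * exp(-d**2)         (a hypothetical choice of f).
rng = np.random.default_rng(1)
n, m = 3, 100_000                   # dimension n and number of samples m

omega = rng.normal(size=(m, n))
gamma = rng.uniform(0.0, 0.05, size=(m, n))

# pairwise differences omega_j - omega_i for every sample
diff = omega[:, None, :] - omega[:, :, None]        # shape (m, n, n)
fvals = 0.1 * np.exp(-diff**2)
inner = fvals.sum(axis=2) - 0.1                     # drop the j == i term f(0)
T = (gamma * (1.0 + inner) ** 2).sum(axis=1)

estimate = np.exp(T).mean()
print(estimate)
```

Note that with heavier-tailed choices (e.g. unbounded $\gamma_i$ or large $f$), $\mathbb{E}[\exp T]$ may be infinite, so the existence of the expectation should be checked before trusting any such estimate.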