Expectation of $\frac{X_1 Y_1 + X_2 Y_2 + ... + X_n Y_n}{X_1 + X_2 + ... + X_n}$ where $Y_1, Y_2, ... Y_n$ are I.I.D. and $X_i = g(Y_i)$


I'm not sure how to start, as expanding it into integrals gets very messy. I have a feeling that the expectation for $n = 1$ is the same as for $n = 100$, and so the expectation is just $E[Y_1]$. Is this true? How can I show this?



Best answer:

Concretely, assuming $E(Y_i \mid X_i) = E(Y_j \mid X_j)$ for all $i, j$: $$E\left(\frac{\sum_i X_i Y_i}{\sum_i X_i}\right) = E_{X_{1:N}}\,E_{Y_{1:N}}\left(\frac{\sum_i X_i Y_i}{\sum_i X_i}\ \middle|\ X_{1:N}\right) = E_{X_{1:N}}\left(\frac{\sum_i X_i\, E(Y_i \mid X_{1:N})}{\sum_i X_i}\right)$$ $$= E_{X_{1:N}}\left(\frac{\sum_i X_i\, E(Y_i \mid X_i)}{\sum_i X_i}\right) = E_{X_{1:N}}\left( E(Y_j \mid X_j)\, \frac{\sum_i X_i}{\sum_i X_i}\right) = E_{X_{1:N}}\bigl(E(Y_j \mid X_j)\bigr) = E(Y_j)$$ for any $j$. The third equality uses the independence of the pairs $(X_i, Y_i)$, which gives $E(Y_i \mid X_{1:N}) = E(Y_i \mid X_i)$.

The condition $E(Y_i|X_i) = E(Y_j|X_j)$ could hold in some cases even if $X = g(Y)$.

For example, when $Y \sim U[-2,2]$ and

$X = 1$ when $-1 \leq Y \leq 1$ and $X = 0$ otherwise,

we have $E(Y \mid X = 0) = E(Y \mid X = 1) = 0$.
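A quick Monte Carlo check of this example (a sketch; the sample size, $n$, and seed are arbitrary choices). Since all $X_i$ can be zero with positive probability, the simulation conditions on $\sum_i X_i > 0$; in this example the conditional mean is still $0$ because $E(Y \mid X = 0) = E(Y \mid X = 1) = 0$:

```python
import numpy as np

rng = np.random.default_rng(0)
trials, n = 200_000, 10

# Y_i ~ U[-2, 2] i.i.d.; X_i = 1 if -1 <= Y_i <= 1, else 0
Y = rng.uniform(-2.0, 2.0, size=(trials, n))
X = (np.abs(Y) <= 1.0).astype(float)

num = (X * Y).sum(axis=1)
den = X.sum(axis=1)

# Drop the (rare) trials where every X_i = 0, i.e. condition on den > 0
mask = den > 0
est = (num[mask] / den[mask]).mean()

print(est)  # should be close to E(Y_1) = 0
```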

As $n \rightarrow \infty$:

By the strong law of large numbers applied to $\frac{1}{n}\sum_i X_i Y_i$ and $\frac{1}{n}\sum_i X_i$, together with Slutsky's theorem (https://en.wikipedia.org/wiki/Slutsky%27s_theorem): $$\frac{\sum_i X_i Y_i}{\sum_i X_i} = \frac{\frac{1}{n}\sum_i X_i Y_i}{\frac{1}{n}\sum_i X_i} \xrightarrow{\text{a.s.}} \frac{E(XY)}{E(X)}$$

For $L_1$ convergence, assuming $0 < B \leq |X_i|, |Y_i| \leq C$, the expectations converge as well:

$$E\left(\frac{\sum_i X_i Y_i}{\sum_i X_i}\right) \rightarrow \frac{E(XY)}{E(X)}$$

(see: Convergence in $L^1$ of ratio of random variables).
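As an illustration of the limit (a sketch with my own arbitrary choice of $g$, not from the answer): take $Y \sim U[1,2]$ and $X = g(Y) = Y^2$, so $E(XY) = E(Y^3) = 15/4$ and $E(X) = E(Y^2) = 7/3$, giving the limit $45/28 \approx 1.6071$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Y ~ U[1, 2], X = g(Y) = Y**2 (bounded away from 0, so no division issues)
for n in (100, 10_000, 1_000_000):
    Y = rng.uniform(1.0, 2.0, size=n)
    X = Y ** 2
    ratio = (X * Y).sum() / X.sum()
    print(n, ratio)  # approaches E(XY)/E(X) = (15/4)/(7/3) = 45/28
```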

Another answer:

If $g>0$, $L(s)=E(e^{-sX_1})$ and $L_1(s)=E(X_1Y_1e^{-sX_1})$, then, writing $\frac{1}{t}=\int_0^{\infty}e^{-st}\,ds$ for $t>0$ with $t=X_1+\cdots+X_n$ and using independence, $$E\frac{X_1Y_1+\cdots+X_nY_n}{X_1+\cdots+X_n}=n\int_0^{\infty}L_1(s)L^{n-1}(s)\,ds.$$
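A hedged sanity check of this identity for one concrete case (my own choice, not from the answer): take $Y_i \sim \text{Exp}(1)$ and $g(y) = y$, so $X_i = Y_i > 0$. Then $L(s) = 1/(1+s)$ and $L_1(s) = E(Y_1^2 e^{-sY_1}) = 2/(1+s)^3$, and the right-hand side evaluates in closed form to $n\int_0^{\infty} 2(1+s)^{-(n+2)}\,ds = 2n/(n+1)$, which a Monte Carlo estimate of the left-hand side should match:

```python
import numpy as np

rng = np.random.default_rng(2)
trials, n = 1_000_000, 2

# Y_i ~ Exp(1), X_i = Y_i; RHS has the closed form 2n/(n+1) in this case
Y = rng.exponential(1.0, size=(trials, n))
lhs = ((Y * Y).sum(axis=1) / Y.sum(axis=1)).mean()  # E[(sum Y_i^2)/(sum Y_i)]
rhs = 2 * n / (n + 1)

print(lhs, rhs)  # lhs ≈ rhs = 4/3 for n = 2
```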