Let $X,Y$ be two random variables which are not necessarily independent. It is easy to get $\mathbb{E}(X)$ and $\mathbb{E}(Y)$. I want to know: is there some approximation to $\mathbb{E}(\frac{X}{Y})$?
[Update] The background is that I want to calculate the expectation of the Pearson product-moment correlation coefficient, $\mathbb{E}(\rho_{xy})$.
The Pearson product-moment correlation coefficient is $\rho_{xy} = \frac{Cov_{xy}}{\sigma_x\sigma_y}$
It is easy to get $\mathbb{E}(Cov_{xy})$ and $\mathbb{E}(\sigma_x\sigma_y)$, so I want an approximation for $\mathbb{E}(\rho_{xy})$ in terms of those two values.
If $X$ and $Y$ have a joint density $f(x,y)$, $E[X/Y] = \int_{-\infty}^\infty \int_{-\infty}^\infty \frac{x}{y} \ f(x,y)\ dx\ dy$ (assuming that converges absolutely). Similarly, if they have a joint probability mass function $p(x,y)$, $E[X/Y] = \sum_x \sum_y \frac{x}{y} p(x,y)$.
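In the discrete case the double sum can be evaluated directly. A minimal sketch, using a small made-up joint pmf (the probabilities here are purely illustrative), which also shows that $E[X/Y]$ generally differs from the naive ratio $E[X]/E[Y]$:

```python
# Hypothetical joint pmf p(x, y) on four points, chosen only for illustration.
pmf = {
    (1, 1): 0.2,
    (1, 2): 0.3,
    (2, 1): 0.1,
    (2, 2): 0.4,
}

# E[X/Y] = sum over (x, y) of (x/y) * p(x, y)
e_ratio = sum((x / y) * p for (x, y), p in pmf.items())

# Marginal means, for comparison with the naive estimate E[X]/E[Y]
e_x = sum(x * p for (x, y), p in pmf.items())
e_y = sum(y * p for (x, y), p in pmf.items())

print(e_ratio)      # exact E[X/Y]
print(e_x / e_y)    # naive E[X]/E[Y] -- generally a different number
```

For this pmf, $E[X/Y] = 0.95$ while $E[X]/E[Y] = 1.5/1.7 \approx 0.88$, so the two quantities really do differ when $X$ and $Y$ are dependent.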
Let $\mu = E[Y]$. If you can treat $Y - \mu$ as small compared to $\mu$ (in particular if there is $c>0$ such that $c < Y < 2 \mu - c$ almost surely) then $$E[X/Y] = \sum_{j=0}^\infty (-1)^j \frac{E[X (Y - \mu)^j]}{ \mu^{j+1}}$$ so you could use a partial sum of that as an approximation.
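The partial-sum approximation can be checked numerically. A sketch under assumed distributions (the choice of $Y$ uniform on $(1.5, 2.5)$, so that $\mu = 2$ and $|Y-\mu| < 0.5 \ll \mu$, and of $X = Y + \text{noise}$ to make the pair dependent, is purely illustrative), estimating the moments $E[X(Y-\mu)^j]$ from the same Monte Carlo sample:

```python
import random

random.seed(0)
n = 200_000

# Assumed example distributions: Y uniform on (1.5, 2.5) keeps Y - mu small
# relative to mu; X = Y + Gaussian noise makes X and Y dependent.
ys = [random.uniform(1.5, 2.5) for _ in range(n)]
xs = [y + random.gauss(0.0, 0.2) for y in ys]

mu = sum(ys) / n
mc = sum(x / y for x, y in zip(xs, ys)) / n  # direct Monte Carlo E[X/Y]

def series_partial_sum(xs, ys, mu, terms):
    """Partial sum of E[X/Y] = sum_j (-1)^j E[X (Y - mu)^j] / mu^(j+1),
    with each moment E[X (Y - mu)^j] estimated from the sample."""
    total = 0.0
    for j in range(terms):
        moment = sum(x * (y - mu) ** j for x, y in zip(xs, ys)) / len(xs)
        total += (-1) ** j * moment / mu ** (j + 1)
    return total

for terms in (1, 2, 4, 8):
    print(terms, series_partial_sum(xs, ys, mu, terms))
print("Monte Carlo:", mc)
```

Because $|Y - \mu|/\mu \le 0.25$ here, the terms shrink geometrically and a handful of them already match the direct Monte Carlo estimate closely; the one-term truncation is just the familiar first-order approximation $E[X/Y] \approx E[X]/E[Y]$.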