Say we have two continuous independent random variables $X$ and $Y$. I understand that
$E[XY] = E[X]E[Y]$
but what about $E[X/Y]$? Can I say:
$E[Xg(Y)]$ where $g(x) = \dfrac{1}{x}$
$E[X]E[g(Y)]$, since $X$ and $g(Y)$ are independent
Then solve for each? If not then how does one go about solving this type of problem?
Let us analyze this through the lens of the product distribution. First, define a random variable $Z = {1 \over Y}$. Since $Y$ is independent of $X$, so is $Z$: a measurable function of a variable independent of $X$ remains independent of $X$.
For two independent random variables, the expectation of their product is the product of their expectations. This can be proved through the Law of Total Expectation.
$$\mathbb E(XZ) = \mathbb E_Z\big(\mathbb E(XZ \mid Z)\big)$$ $$\text{and since } Z \text{ acts as a constant inside the inner conditional expectation...}$$ $$=\mathbb E_Z\big(Z \cdot \mathbb E(X \mid Z)\big)$$ $$\text{and since independence gives } \mathbb E(X \mid Z) = \mathbb E(X)\text{...}$$ $$=\mathbb E(X) \cdot \mathbb E(Z)$$
To come back to your original problem...
$$\mathbb E\left({X \over Y}\right)$$ $$= \mathbb E\left(X \cdot {1 \over Y}\right)$$
$$=\mathbb E(X \cdot Z)$$ $$=\mathbb E(X) \cdot \mathbb E(Z)$$ $$=\mathbb E(X) \cdot \mathbb E\left({1 \over Y}\right)$$
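As a quick sanity check, here is a small Monte Carlo sketch. The distributions are arbitrary choices for illustration: $X \sim \mathrm{Exponential}(1)$ (so $\mathbb E(X) = 1$) and $Y \sim \mathrm{Uniform}(1, 2)$ (so $\mathbb E\left({1 \over Y}\right) = \ln 2$).

```python
import math
import random

random.seed(0)
N = 200_000

# Illustrative choice of independent distributions:
# X ~ Exponential(1), so E[X] = 1; Y ~ Uniform(1, 2), so E[1/Y] = ln 2.
xs = [random.expovariate(1.0) for _ in range(N)]
ys = [random.uniform(1.0, 2.0) for _ in range(N)]

# Sample mean of X/Y versus the product of the separate sample means.
mean_ratio = sum(x / y for x, y in zip(xs, ys)) / N
product = (sum(xs) / N) * (sum(1 / y for y in ys) / N)

print(mean_ratio, product, math.log(2))
```

Both estimates should land near $\ln 2 \approx 0.693$, consistent with $\mathbb E(X/Y) = \mathbb E(X)\,\mathbb E(1/Y)$.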
However, it is very rarely the case that $\mathbb E\left({1 \over Y}\right) = {1 \over \mathbb E(Y)}$. (Note also that $\mathbb E\left({1 \over Y}\right)$ need not exist at all, e.g. when $Y$ puts mass arbitrarily close to zero.)
Jensen's inequality states that if $U$ is a random variable and $\varphi$ is a convex function, then $\varphi(\mathbb {E}(U)) \leq \mathbb E(\varphi(U))$.
If $U$ is strictly positive, then $\varphi(u) = {1 \over u}$ is strictly convex on $U$'s support, so ${1 \over \mathbb E(U)} \leq \mathbb E\left({1 \over U}\right)$, with equality only if $U$ has zero variance, i.e., is almost surely constant.
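To see the Jensen gap concretely, take $U \sim \mathrm{Uniform}(1, 2)$ (an illustrative choice): exactly, $\mathbb E\left({1 \over U}\right) = \int_1^2 {1 \over u}\,du = \ln 2 \approx 0.693$, while ${1 \over \mathbb E(U)} = {2 \over 3} \approx 0.667$. A short simulation reproduces the strict inequality:

```python
import math
import random

random.seed(1)
N = 200_000

# U ~ Uniform(1, 2): exactly, E[1/U] = ln 2, while 1/E[U] = 2/3.
us = [random.uniform(1.0, 2.0) for _ in range(N)]

e_inv = sum(1 / u for u in us) / N  # estimate of E[1/U]
inv_e = 1 / (sum(us) / N)           # estimate of 1/E[U]

print(e_inv, inv_e)  # Jensen: e_inv strictly exceeds inv_e
```

The gap $\ln 2 - {2 \over 3} \approx 0.026$ is small here only because $U$ has small variance; spread $U$ out more and the gap grows.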