Let $X,Y:(\Omega,\mathcal F)\rightarrow(\mathbb R,\mathcal B)$ be measurable, i.e. random variables, and let $P$ be a probability measure on $(\Omega,\mathcal F)$. What is the precise definition of $E[Z]$, where $E$ denotes expectation and $Z = X\cdot Y$?
In my undergraduate course we calculated $E[Z]$ by $$E[Z] = E[XY] = \int\int xy\,f(x,y)\,\mathrm dx\,\mathrm dy$$ (assuming for a second that the density $f$ exists). I don't see how we get the double integral. I guess $Z$ is defined on $(\Omega\times\Omega,\mathcal F\otimes\mathcal F)$? But what is the corresponding probability measure? How do we know that such a measure even exists?
Clearly, if $X$ and $Y$ are independent, then the product measure $P\otimes P$ is the corresponding measure and we get $$E[Z] = \int Z\,\mathrm d(P\otimes P) = \int\int X\cdot Y\,\mathrm dP\,\mathrm dP$$ by Fubini's theorem.
No, $Z$ is also a function $(\Omega, \mathcal F) \to (\mathbb R, \mathcal B)$; specifically, $$ Z(\omega) = X(\omega)\, Y(\omega). $$ Its expectation is calculated the same way as for $X$ or $Y$, i.e. $$ E[Z] = \int_{\Omega} Z(\omega) \,\mathrm{d}P(\omega) = \int_{\Omega} X(\omega)\, Y(\omega) \,\mathrm{d}P(\omega). $$

You can also evaluate things in terms of a measure on $\mathbb R^2$, which is what you are thinking of. Rather than all the information $P$ provides, we only need the probability measure $\mu_{X, Y}$ on $\mathbb R^2$ that $X$ and $Y$ induce (their joint distribution). Then $$ E[Z] = \int_{\mathbb R^2} xy \,\mathrm{d}\mu_{X, Y}(x, y). $$

If $X$ and $Y$ are independent, then $\mu_{X, Y} = \mu_X \otimes \mu_Y$, where $\mu_X$ and $\mu_Y$ are the probability measures induced on $\mathbb R$ by $X$ and $Y$ respectively (you may need to prove this; it can also be taken as a definition of independence). This lets us split $\mu_{X, Y}$ (not $P$), and by Fubini's theorem $$ E[Z] = \int_{\mathbb R} \int_{\mathbb R} x y \,\mathrm{d}\mu_X(x) \,\mathrm{d}\mu_Y(y) = \int_{\mathbb R} x \,\mathrm{d}\mu_X(x) \int_{\mathbb R} y \,\mathrm{d}\mu_Y(y) = E[X]\, E[Y]. $$
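To see the first formula concretely: since $Z(\omega)=X(\omega)Y(\omega)$ lives on the same $(\Omega,\mathcal F,P)$, you can approximate $E[Z]$ by averaging the pointwise product over many sampled $\omega$'s. Here is a minimal Monte Carlo sketch (my own illustration, not part of the question; the specific distributions $X \sim N(1,1)$, $Y \sim N(2,1)$ are arbitrary choices):

```python
import numpy as np

# Monte Carlo sketch: approximate E[XY] by averaging the pointwise
# product X(omega) * Y(omega) over many sampled outcomes omega.
rng = np.random.default_rng(0)
n = 1_000_000

# Independent case: X ~ N(1, 1) and Y ~ N(2, 1), sampled separately.
x = rng.normal(loc=1.0, scale=1.0, size=n)
y = rng.normal(loc=2.0, scale=1.0, size=n)
print(np.mean(x * y))            # close to E[X]E[Y] = 1 * 2 = 2
print(np.mean(x) * np.mean(y))   # also close to 2

# Dependent case: take Y = X. Then E[XY] = E[X^2] = Var(X) + E[X]^2 = 2,
# while E[X]E[Y] = 1, so the product formula fails without independence.
print(np.mean(x * x))            # close to 2, not to E[X]E[Y] = 1
```

The dependent case illustrates why the splitting $\mu_{X,Y} = \mu_X \otimes \mu_Y$ really does require independence.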