Denote the joint distribution of random variables $X_1, X_2, \dots, X_n$ by $p(\mathbf{x}) = p(x_1, x_2, \dots, x_n)$, and the marginal distribution of $X_i$ by $p(x_i)$.
Can I write $\mathbb{E}_{p(\mathbf{x})}[X_i] = \mathbb{E}_{p(x_i)}[X_i]$? That is, can I drop the irrelevant variables $X_2, \dots, X_n$ from the joint distribution $p(x_1, x_2, \dots, x_n)$ over which the expectation is taken?
My rationale is (for the sake of argument let $i=1$): $$\mathbb{E}_{p(\mathbf{x})}[X_1] = \int \dots \int x_1\, p(x_1, \dots, x_n)\, dx_1\, dx_2 \dots dx_n \\ = \int \left( \int \dots \int x_1\, p(x_1, \dots, x_n)\, dx_2 \dots dx_n \right) dx_1 \\ = \int x_1 \left( \int \dots \int p(x_1, \dots, x_n)\, dx_2 \dots dx_n \right) dx_1 \\ = \int x_1\, p(x_1)\, dx_1 = \mathbb{E}_{p(x_1)}[X_1]$$
where the second equality (switching the order of integration) is justified by Fubini's theorem, which applies whenever $\mathbb{E}[|X_1|] < \infty$ (in particular for a multivariate Gaussian), and the third equality pulls $x_1$ out of the inner integrals because it does not depend on $x_2, \dots, x_n$.
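As a quick numerical sanity check, here is a Monte Carlo comparison of the two expectations (a sketch using NumPy; the bivariate Gaussian mean and covariance below are made-up illustration values, not from anything above):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical bivariate Gaussian: X1 ~ N(1.5, 1.0), correlated with X2.
mean = np.array([1.5, -2.0])
cov = np.array([[1.0, 0.6],
                [0.6, 2.0]])

# E_{p(x)}[X1]: average the X1 coordinate of draws from the joint.
joint_samples = rng.multivariate_normal(mean, cov, size=200_000)
e_joint = joint_samples[:, 0].mean()

# E_{p(x1)}[X1]: the marginal of X1 is N(mean[0], cov[0, 0]); sample it directly.
marginal_samples = rng.normal(mean[0], np.sqrt(cov[0, 0]), size=200_000)
e_marginal = marginal_samples.mean()

print(e_joint, e_marginal)  # both should be close to 1.5
```

Both estimates agree (up to Monte Carlo error) with the true mean of $X_1$, consistent with the derivation above.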
Is this correct?