So we have the formulas $$Eg(X) = \int \cdots \int g(x_1,\dots,x_k)\,f_X(x_1,\dots,x_k) \, dx_1\cdots dx_k$$ or, in the discrete case, $$Eg(X) = \sum_{x\in M} g(x)\,f_X(x),$$ where $X:\Omega\to\mathbb{R}^k$ is a continuous (respectively, discrete) random vector with pdf (pmf) $f_X:\mathbb{R}^k\to\mathbb{R}$, and $g:\mathbb{R}^k\to\mathbb{R}$ is any function such that $Eg(X)$ exists.
Now suppose $X$ has the exponential distribution with mean $\lambda$; then $$E(X) = \int_0^\infty x \cdot \frac{1}{\lambda}e^{-x/\lambda}\,dx.$$
Where does that $x$ at the start of the integrand come from? The rest comes from the density function, of course. Does the $x$ come from the function $g$ somehow? (Similarly, the $x$ appears in the discrete case, if that is easier to explain.)
If necessary, perhaps use the definition $$E(X) =\int X \, dP,$$ where $P$ is a probability measure.
EDIT: $E(X)$ doesn't just mean $E(g(X))$ with $g(x) = x$, does it...?
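(As a side check, not essential to the question: an integration-by-parts sketch confirms that this particular integral evaluates to $\lambda$, with $u = x$ and $dv = \frac{1}{\lambda}e^{-x/\lambda}\,dx$:)

$$\int_0^\infty x \cdot \frac{1}{\lambda}e^{-x/\lambda}\,dx = \Big[-x e^{-x/\lambda}\Big]_0^\infty + \int_0^\infty e^{-x/\lambda}\,dx = 0 + \Big[-\lambda e^{-x/\lambda}\Big]_0^\infty = \lambda.$$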
Recall the definition of expected value for a discrete random variable $X$ whose range is $0, 1, 2, ...$:
$$E(X) = 0\cdot P(X = 0) + 1\cdot P(X = 1) + 2\cdot P(X = 2) + \cdots$$
Let us write this compactly:
$$E(X) = \sum_{k=0}^{\infty} kP(X = k)$$
Or if you like:
$$E(X) = \sum_{x=0}^{\infty} xP(X = x) \ (*)$$
Here $x$ and $k$ are just dummy variables. You could use $\xi$ if you liked, but I think most people prefer the lower-case version of the letter naming the random variable when the random variable is written with a capital letter.
Similarly, for all well-behaved $g$,
$$E(g(X)) = \sum_{x=0}^{\infty} g(x)P(X = x) \ (**)$$
If we take $g(x) = x$, then $(**)$ becomes $(*)$. So yes, $E(X)$ is exactly $E(g(X))$ with $g$ the identity function.
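As a numerical sanity check (a sketch with an assumed example pmf, a fair six-sided die, not taken from the post), formula $(**)$ can be computed directly, and choosing $g(x) = x$ recovers the plain expected value from $(*)$:

```python
# Verify E(g(X)) = sum over x of g(x) * P(X = x), and that g(x) = x gives E(X).
# Example pmf: a fair six-sided die (an illustrative assumption).

pmf = {x: 1/6 for x in range(1, 7)}  # P(X = x) for x = 1, ..., 6

def expectation(g, pmf):
    """Compute E(g(X)) = sum_x g(x) * P(X = x), as in (**)."""
    return sum(g(x) * p for x, p in pmf.items())

E_X = expectation(lambda x: x, pmf)      # g(x) = x, i.e. formula (*)
E_X2 = expectation(lambda x: x**2, pmf)  # g(x) = x^2, i.e. E(X^2)

print(E_X)   # 3.5
print(E_X2)  # 91/6 = 15.1666...
```

The identity-function case is not a special definition: it is just $(**)$ applied with the simplest possible $g$.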