Let $X$ be a random variable (r.v.) taking values in $D\subset\mathbb{R}$. Consider $Y = Y(X)$, a function of $X$, which can be treated as another r.v. Now suppose the quantities $$\mathrm{E}(Y^k) = \int_D Y^k(x)\,dF(x)$$ are known for $k = 0, 1, 2, \ldots$, where $F$ is the distribution of $X$. Is there any way to uniquely determine the distribution of $Y$? If not, is there any way to recover $Y$ through expectations of functions taken with respect to $F$ rather than with respect to the distribution of $Y$?
Note: $\mathrm{E}(Y^k)$ seems NOT to be the $k$th moment of $Y$, since the measure in the integral is with respect to $X$ rather than $Y$. My question is equivalent to asking whether the function $\mathrm{E}(e^{tY}) := \int_D e^{tY(x)}\, dF(x)$ uniquely determines $Y$.
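To make the setup concrete, here is a minimal numerical sketch of the quantities taken as known, assuming (purely as an illustration) that $X$ is uniform on $D = [0,1]$, so $dF(x) = dx$, and $Y(x) = x^2$:

```python
from scipy.integrate import quad

# Illustrative (hypothetical) choices: X uniform on D = [0, 1], so dF(x) = dx,
# and Y(x) = x**2.  The known quantities are E(Y^k) = \int_D Y(x)^k dF(x).
Y = lambda x: x**2

for k in range(5):
    val, _ = quad(lambda x: Y(x)**k, 0.0, 1.0)   # integral against F, not against the law of Y
    print(f"E(Y^{k}) = {val:.6f}")               # equals 1/(2k + 1) for this choice
```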
Let's be a little more precise in our notation. Let $X$ be a random variable taking values in $D\subset\mathbb R$, and let $F$ be the distribution function of $X$. Let $g:D\to\mathbb R$ be some function, and consider the random variable $Y:=g(X)$. Let $G$ be the distribution function of $Y$. A standard fact of probability theory is that
$$ \int y \, dG(y) = \int g(x) \, dF(x).$$
That is, $\mathrm E(Y)=\mathrm E(g(X))$ can be calculated either using the distribution function of $Y$ or the distribution function of $X$. An immediate corollary is that, if $h$ is another function,
$$ \int h(y) \, dG(y) = \int h(g(x)) \, dF(x).$$
This can be seen by letting $Z=h(Y) = (h\circ g)(X)$, letting $H$ be the distribution function of $Z$, and applying our original result first with $Z$ as a function of $Y$ and then with $Z$ as a function of $X$. If we apply this with $h(y)=y^k$, we see that either side is simply the $k^\text{th}$ moment of $Y$. (Hence the notation $\mathrm E(Y^k)$.) Thus, your question reduces to the following: do the moments of a random variable uniquely determine its distribution? If $Y$ takes values in a bounded set, then the answer is yes. (If not, the problem is more delicate: this is the classical moment problem, and for unbounded support the moments need not determine the distribution uniquely; the lognormal distribution is the standard counterexample.)
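As a sanity check of $\mathrm E(h(Y)) = \mathrm E(h(g(X)))$ with $h(y) = y^k$, here is a minimal Monte Carlo sketch, again assuming (purely as an illustration) $X$ uniform on $[0,1]$ and $g(x) = x^2$, so that $Y$ takes values in the bounded set $[0,1]$ and $\int_0^1 (x^2)^k\,dx = 1/(2k+1)$:

```python
import numpy as np

# Hypothetical example: X uniform on [0, 1], g(x) = x**2, so Y = X**2.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=1_000_000)
y = x**2                                   # samples from the law G of Y

for k in range(1, 5):
    mc = np.mean(y**k)                     # k-th moment of Y, estimated from samples of Y
    exact = 1.0 / (2 * k + 1)              # \int_0^1 g(x)^k dF(x), the integral against F
    print(f"k={k}:  sample moment of Y = {mc:.4f},  integral against F = {exact:.4f}")
```

The two columns agree up to sampling error, illustrating that the quantities in the question really are the moments of $Y$.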