I have read that the $n$th moment of a random variable is defined as
$$\mathbb{E}[X^n] = \int_{-\infty}^\infty x^n f_X(x) \, dx$$
where
$$f_X(x) \sim \mathbb{P}[X = x]$$
My question is why does the preceding line hold true instead of
$$f_X(x^n) \sim \mathbb{P}[X = x^n].$$
In addition, is it possible to find the $n$th moment using the tailsum formula for the first moment?
$$\mathbb{E}[X] = \int_0^\infty \left(1 - F_X(x)\right) dx \qquad (\text{for } X \ge 0)$$
where
$$F_X(x) = \int_{-\infty}^x f_X(t) \, dt.$$
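As a quick numerical sanity check on the tail-sum formula, which in this form requires a nonnegative random variable: the sketch below assumes $X \sim \text{Exponential}(\lambda)$ (an example not taken from the question), for which $1 - F_X(x) = e^{-\lambda x}$ and the known mean is $1/\lambda$.

```python
import math

# Tail-sum check for a nonnegative random variable:
#   E[X] = ∫_0^∞ (1 - F_X(x)) dx.
# Assumed example: X ~ Exponential(lam), with survival function
# 1 - F_X(x) = exp(-lam * x) and known mean 1/lam.

def tail_sum_mean(survival, upper, steps):
    """Trapezoidal approximation of ∫_0^upper survival(x) dx."""
    dx = upper / steps
    total = 0.5 * (survival(0.0) + survival(upper))
    for i in range(1, steps):
        total += survival(i * dx)
    return total * dx

lam = 2.0
approx = tail_sum_mean(lambda x: math.exp(-lam * x), upper=40.0, steps=400_000)
print(approx)  # ≈ 0.5 = 1/lam
```

The infinite upper limit is truncated at $40$, where the exponential tail is already negligible.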
A probability density function $f_X$ of a random variable $X$ is not characterized by $f_X(x) = \Pr(X=x),$ but by $$ \Pr(X\in A) = \int_A f_X(x)\,dx $$ for every measurable set $A.$ The same notation, $f_X,$ is used for probability mass functions of discrete random variables, for which we have $\Pr(X=x) = f_X(x)$ for every $x$ in the domain of $f_X.$
The expected value of a random variable with a density function is given by $$ \operatorname E(X) = \int_{-\infty}^\infty x f_X(x)\,dx. $$ Now suppose $g$ is some function for which $\operatorname E(g(X))$ exists. Then we have \begin{align} & \operatorname E(g(X)) \\[10pt] = {} & \int_{-\infty}^\infty x f_{g(X)}(x) \, dx \tag 1 \\[10pt] = {} & \int_{-\infty}^\infty g(x) f_X(x)\, dx. \tag 2 \end{align} Equality of the integrals in lines $(1)$ and $(2)$ is sometimes called the "law of the unconscious statistician." When I finished my Ph.D. in statistics I had never heard that term, although I was well aware of the equality.
For example, $$ \operatorname E(X^2) = \int_{-\infty}^\infty x f_{X^2}(x)\,dx = \int_{-\infty}^\infty x^2 f_X(x)\, dx. $$
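The two integrals for $\operatorname E(X^2)$ can be checked numerically. The sketch below assumes $X \sim N(0,1)$ (an example not in the answer), so $X^2$ has the chi-square distribution with one degree of freedom, $f_{X^2}(x) = e^{-x/2}/\sqrt{2\pi x}$ for $x > 0$, and both integrals should equal $\operatorname E(X^2) = 1$.

```python
import math

# Numerical check of both LOTUS integrals for g(x) = x^2.
# Assumed example: X ~ N(0, 1), so X^2 is chi-square with 1 d.f.
# and E(X^2) = Var(X) = 1.

def trapezoid(f, a, b, steps):
    """Trapezoidal approximation of ∫_a^b f(x) dx."""
    dx = (b - a) / steps
    total = 0.5 * (f(a) + f(b))
    for i in range(1, steps):
        total += f(a + i * dx)
    return total * dx

# (1) ∫ x f_{X^2}(x) dx, with f_{X^2}(x) = exp(-x/2) / sqrt(2*pi*x);
#     the integrand x * f_{X^2}(x) = sqrt(x) exp(-x/2) / sqrt(2*pi) is 0 at x = 0.
lhs = trapezoid(lambda x: math.sqrt(x) * math.exp(-x / 2) / math.sqrt(2 * math.pi),
                0.0, 60.0, 600_000)

# (2) ∫ x^2 f_X(x) dx with the standard normal density.
rhs = trapezoid(lambda x: x * x * math.exp(-x * x / 2) / math.sqrt(2 * math.pi),
                -12.0, 12.0, 240_000)

print(lhs, rhs)   # both ≈ 1
```

The infinite limits are truncated where both integrands are already negligible.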
The reason for using the form on line $(2)$ above is that it is usually simpler.
A proof of this equality has probably been requested on this site before, so a search may turn one up.