Relation between the moments of distributions


Assume we have a random variable $X$ with a pdf $\, f_{X}(x)$. Let us define a variable $Y$ $$ Y = g(X), $$ where $g$ is a continuous increasing (or decreasing) function. Is it possible (and if so, when) to express the moments of $Y$ in terms of the moments of $X$?

Well, if $g$ is a linear function, then the answer is obvious. I am asking about more general cases.
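For the linear case the questioner mentions, a short sketch may help: if $Y = aX + b$, the binomial theorem gives $E[Y^n] = \sum_k \binom{n}{k} a^k b^{n-k} E[X^k]$, so the $n$-th moment of $Y$ depends only on the first $n$ raw moments of $X$. The function name and the choice of a standard normal are illustrative, not from the question:

```python
import math

def linear_moments(a, b, M, n):
    """n-th raw moment of Y = a*X + b from raw moments M of X,
    where M[k] = E[X^k] (so M[0] = 1), via the binomial theorem."""
    return sum(math.comb(n, k) * a**k * b**(n - k) * M[k] for k in range(n + 1))

# Raw moments of a standard normal up to order 2: M[0..2] = 1, 0, 1
M = [1, 0, 1]
# Second moment of Y = 2X + 3: E[Y^2] = 4*E[X^2] + 12*E[X] + 9 = 13
print(linear_moments(2, 3, M, 2))  # → 13
```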


Yes, if $g(x)$ has a Taylor series that converges on the whole support of $X$ (and the resulting series of moments itself converges).

The $i$-th non-central (raw) moment of $X$ is defined as

$M_i = \int x^i f_X(x)\, dx.$

Similarly, the $i$-th non-central moment of $Y = g(X)$ is

$N_i = \int g(x)^i f_X(x)\, dx = \int \left(\sum_k a_k x^k\right)^i f_X(x)\, dx = \int Q(x) f_X(x)\, dx = \sum_j \alpha_j M_j,$

where $Q(x) = \left(\sum_k a_k x^k\right)^i = \sum_j \alpha_j x^j$ is a power series: the Taylor series of $g(x)$ raised to the power $i$. The last equality exchanges summation and integration, which is justified when the series converges suitably.
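To make the $N_i = \sum_j \alpha_j M_j$ step concrete, here is a minimal sketch for the case where $g$ is itself a polynomial (so the expansion is finite and exact). The coefficients $\alpha_j$ of $Q(x) = g(x)^i$ are obtained by repeated convolution of the coefficient list; the polynomial $g(x) = 1 + x + x^2/2$ and the choice of a standard normal $X$ are assumptions for illustration:

```python
def poly_power(coeffs, i):
    """Coefficients alpha_j of Q(x) = (sum_k a_k x^k)^i,
    computed by repeated polynomial multiplication (convolution)."""
    q = [1.0]
    for _ in range(i):
        out = [0.0] * (len(q) + len(coeffs) - 1)
        for m, qm in enumerate(q):
            for k, ak in enumerate(coeffs):
                out[m + k] += qm * ak
        q = out
    return q

def moment_of_g(coeffs, i, M):
    """N_i = sum_j alpha_j M_j, with M[j] = E[X^j]."""
    alpha = poly_power(coeffs, i)
    return sum(a * M[j] for j, a in enumerate(alpha))

# g(x) = 1 + x + x^2/2, X ~ N(0,1), whose raw moments are
# M_j = 0 for odd j and (j-1)!! for even j: 1, 0, 1, 0, 3
M = [1, 0, 1, 0, 3]
print(moment_of_g([1, 1, 0.5], 2, M))  # → 3.75
```

Checking by hand: $g(x)^2 = 1 + 2x + 2x^2 + x^3 + x^4/4$, so $N_2 = 1 + 0 + 2 + 0 + 3/4 = 3.75$.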

This also holds for central moments, since each central moment is a linear combination of the non-central moments: expand $(X - M_1)^i$ with the binomial theorem.
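The binomial expansion just mentioned can be sketched directly; the Bernoulli example is an assumption chosen because all of its raw moments are equal, which makes the result easy to check by hand ($\mathrm{Var} = p(1-p)$, third central moment $p(1-p)(1-2p)$):

```python
import math

def central_from_raw(M, i):
    """i-th central moment E[(X - mu)^i] from raw moments M[0..i],
    using the binomial expansion with mu = M[1]."""
    mu = M[1]
    return sum(math.comb(i, k) * M[k] * (-mu)**(i - k) for k in range(i + 1))

# X ~ Bernoulli(p): every raw moment E[X^k] equals p for k >= 1
p = 0.3
M = [1, p, p, p]
print(central_from_raw(M, 2))  # variance p(1-p) = 0.21
print(central_from_raw(M, 3))  # third central moment p(1-p)(1-2p) = 0.084
```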

A note of caution: being able to express moments through one another in theory does not mean it will be practical. For example, $e^x$ is monotonically increasing and has a Taylor series that converges everywhere. However, should you try to express the moments of $e^X$ through the moments of $X$, you may not be able to truncate the sum after a few terms, as many terms of the series can remain significant.
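This caution can be illustrated numerically. Assuming $X \sim N(0,1)$ (my choice, not stated in the answer), the exact value is $E[e^{iX}] = e^{i^2/2}$, while the moment series gives $E[e^{iX}] = \sum_k i^k M_k / k!$ with $M_k = (k-1)!!$ for even $k$ and $0$ for odd $k$. Truncating at the same order works well for $i = 1$ but badly underestimates $E[e^{3X}]$:

```python
import math

def normal_moment(k):
    """Raw moment E[X^k] for X ~ N(0,1): 0 for odd k, (k-1)!! for even k."""
    if k % 2:
        return 0.0
    m = k // 2
    return math.factorial(k) / (2**m * math.factorial(m))

def truncated_moment_of_exp(i, K):
    """Approximate E[e^{iX}] by the truncated series sum_{k=0}^{K} i^k M_k / k!."""
    return sum(i**k * normal_moment(k) / math.factorial(k) for k in range(K + 1))

# i = 1: 10 terms already reproduce e^{1/2} to about 5 digits
print(truncated_moment_of_exp(1, 10), math.exp(0.5))
# i = 3: the same 10 terms fall far short of the exact value e^{9/2}
print(truncated_moment_of_exp(3, 10), math.exp(4.5))
```

Raising the truncation order (e.g. $K = 60$ for $i = 3$) recovers the exact value, but the number of significant terms grows quickly with $i$, which is the practical obstacle the answer warns about.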