Is it possible to obtain the expected value of a measurable function of $X$, say $g(X)$, using L-moment statistics?
The expected value and the first L-moment are indeed equivalent. In particular, the first L-moment can be expressed as $$\lambda_1 = \int_0^1 X(F)\,\mathrm{d}F,$$ where $X(F)$ denotes the inverse of the CDF $F$ (the quantile function). This is equal to the expected value $E[X] = \int_{-\infty}^\infty x f(x)\,\mathrm{d}x$.
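For concreteness, here is a quick numerical sanity check of that identity (a sketch of my own; the exponential distribution and rate parameter are just an illustrative choice): for $X \sim \mathrm{Exp}(\lambda)$ with $\lambda = 2$, the quantile function is $Q(u) = -\ln(1-u)/\lambda$ and $E[X] = 1/\lambda = 0.5$, so a midpoint-rule approximation of $\int_0^1 Q(u)\,\mathrm{d}u$ should land near $0.5$:

```python
import math

# Exponential distribution with rate lam: quantile function Q(u) = -ln(1 - u) / lam.
# Its expected value is 1 / lam, so integrating Q over (0, 1) should recover it.
lam = 2.0
Q = lambda u: -math.log1p(-u) / lam

# Midpoint-rule approximation of the first L-moment integral.
n = 100_000
lambda_1 = sum(Q((i + 0.5) / n) for i in range(n)) / n

print(lambda_1)  # close to E[X] = 0.5
```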
I was wondering whether this technique lets us derive the expected value of an invertible function $g \colon \mathbb{R} \to [0,1]$, via $$\tilde{\lambda}_1 = \int_0^1 X(G(X(F)))\,\mathrm{d}F,$$ where $X(G)$ is the inverse function of $g(\cdot)$.
Can I claim that $\tilde{\lambda}_1$ is equivalent to $E[g(X)]$?
Any clues and ideas are much appreciated.
I'm not an expert in L-moments, but maybe the following helps. Denote by $F$ the distribution function, by $f$ the density, and by $Q$ the quantile function of a random variable $X$. Your formula for the first L-moment follows from integration by substitution:
\begin{align} \mathbb{E}[X] &= \int_{Q(0)=-\infty}^{Q(1)=\infty} x f(x)\,dx = \int_0^1 Q(x) f(Q(x)) Q'(x)\,dx = \int_0^1 Q(x) f(Q(x)) \frac{1}{f(Q(x))}\,dx\\ &= \int_0^1 Q(x)\,dx, \end{align}
where I used the derivative of the inverse function, $Q'(x) = 1/f(Q(x))$.

Now, for a (measurable, integrable) function $g \colon \mathbb{R} \to \mathbb{R}$, the same logic gives
\begin{align} \mathbb{E}[g(X)] &= \int_{Q(0)=-\infty}^{Q(1)=\infty} g(x) f(x)\,dx = \int_0^1 g(Q(x)) f(Q(x)) Q'(x)\,dx = \int_0^1 g(Q(x)) f(Q(x)) \frac{1}{f(Q(x))}\,dx\\ &= \int_0^1 g(Q(x))\,dx. \end{align}
In this sense there is no need to bring the inverse of $g$ (if it exists) into play. I hope this helps.
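To make the $\mathbb{E}[g(X)] = \int_0^1 g(Q(x))\,dx$ result concrete, here is a small numerical check (my own sketch; the distribution and the choice $g(x) = x^2$ are illustrative assumptions, not from the question). For $X \sim \mathrm{Exp}(2)$ we have $E[X^2] = 2/\lambda^2 = 0.5$, and both a quadrature over the quantile function and inverse-transform Monte Carlo should agree with that:

```python
import math
import random

# Exponential(rate=2): quantile function Q(u) = -ln(1 - u) / 2.
lam = 2.0
Q = lambda u: -math.log1p(-u) / lam
g = lambda x: x * x  # hypothetical g chosen just for this check

# Quadrature: E[g(X)] = integral of g(Q(u)) over (0, 1), midpoint rule.
# For Exp(2), E[X^2] = 2 / lam**2 = 0.5.
n = 100_000
quadrature = sum(g(Q((i + 0.5) / n)) for i in range(n)) / n

# Monte Carlo cross-check via inverse-transform sampling: u ~ U(0,1), x = Q(u).
random.seed(0)
mc = sum(g(Q(random.random())) for _ in range(200_000)) / 200_000

print(quadrature, mc)  # both close to 0.5
```

Note that nowhere does the code need $g^{-1}$: composing $g$ with the quantile function is enough, which mirrors the derivation above.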