Consider the definition of Markov's inequality in probability theory and the corresponding definition in measure theory.
If I want to relate the two, then according to the definition of expected value:
$$\mathbb{E}(|f|)=\int_{-\infty}^{\infty} |f(x)|\,g_X(x)\,dx$$
where $g_X(x)$ is the probability density function of $X$. Can I then define
$$d\mu(x)=g_X(x)\,dx,$$ so that the Riemann integral is transformed into the Lebesgue integral as follows:
$$\mathbb{E}(|f|)=\int_{\mathbb{R}} |f(x)|\,d\mu(x)?$$
In this way I obtain the equivalence of the two definitions of Markov's inequality.
Is my derivation correct and valid?
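As a numerical illustration of the change of measure $d\mu(x)=g_X(x)\,dx$, here is a small sketch (the choice $X\sim N(0,1)$ and $f(x)=x$ is my own example, not part of the question): the Riemann integral $\int |f(x)|\,g_X(x)\,dx$ and a Monte Carlo estimate of $\int |f|\,d\mu$ should agree, and both should match the known value $\mathbb{E}|X|=\sqrt{2/\pi}$.

```python
import math
import random

def g(x):
    # standard normal density g_X (assumed example distribution)
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def f(x):
    # the function whose expectation we take; f(x) = x for illustration
    return x

# (1) midpoint-rule approximation of the Riemann integral over [-8, 8]
n, a, b = 200_000, -8.0, 8.0
h = (b - a) / n
riemann = sum(abs(f(a + (i + 0.5) * h)) * g(a + (i + 0.5) * h)
              for i in range(n)) * h

# (2) Monte Carlo estimate of the Lebesgue integral against mu,
#     i.e. the sample average of |f(X)| with X drawn from mu
random.seed(0)
mc = sum(abs(f(random.gauss(0, 1))) for _ in range(200_000)) / 200_000

exact = math.sqrt(2 / math.pi)  # E|X| for a standard normal
print(riemann, mc, exact)       # all three should be close
```

Both estimates agree with the closed-form value, which is exactly the point of identifying $d\mu(x)$ with $g_X(x)\,dx$.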


As GWilliams and PhoemueX said, this is the same inequality in different notation (thus, no proof of equivalence is needed), provided that the probabilistic version is corrected to $$ \mathbb{P}(|X|\ge a)\le \frac{\mathbb{E}(|X|)}{a} $$
Of course, the absolute value may be dropped if $X\ge 0$ a.s.
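For a quick sanity check of the corrected probabilistic version $\mathbb{P}(|X|\ge a)\le \mathbb{E}(|X|)/a$, the following sketch compares the empirical tail probability against the Markov bound for a few values of $a$ (again taking $X\sim N(0,1)$ as an assumed example):

```python
import math
import random

# Monte Carlo check of P(|X| >= a) <= E(|X|)/a for X ~ N(0,1)
random.seed(1)
samples = [random.gauss(0, 1) for _ in range(100_000)]
e_abs = sum(abs(x) for x in samples) / len(samples)  # estimates E(|X|)

for a in (0.5, 1.0, 2.0, 3.0):
    p_tail = sum(1 for x in samples if abs(x) >= a) / len(samples)
    bound = e_abs / a
    assert p_tail <= bound  # Markov's bound holds (up to sampling noise)
    print(f"a={a}: P(|X|>=a) ~ {p_tail:.4f} <= E|X|/a ~ {bound:.4f}")
```

Note how loose the bound is for large $a$: Markov's inequality uses only the first moment, so the Gaussian tail decays far faster than $\mathbb{E}(|X|)/a$.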