Markov inequality in real analysis and in probability


Below are two statements of the Markov inequality, in

  1. probability:
    *(image: the probabilistic statement of the inequality)*

  2. measure theory:
    *(image: the measure-theoretic statement of the inequality)*

If I want to relate the two, then by the definition of expected value,

$$\mathbb{E}(|f(X)|)=\int_{-\infty}^{\infty} |f(x)|\,g_X(x)\,dx,$$

where $g_X(x)$ is the probability density function of $X$, can I define
$$d\mu(x)=g_X(x)\,dx,$$ so that the Riemann integral becomes a Lebesgue integral as follows:
$$\mathbb{E}(|f(X)|)=\int_{\mathbb{R}} |f|\,d\mu\,?$$

This would establish the equivalence of the two statements of Markov's inequality.

Is my derivation correct and valid?
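As a sanity check (not part of the original question), the inequality can be verified numerically. The sketch below assumes $X \sim \mathrm{Exponential}(1)$, so $\mathbb{E}|X| = 1$, and compares the empirical tail probability with the Markov bound; note the bound even holds exactly for the empirical distribution itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choice: X ~ Exponential(1), so E|X| = 1.
samples = np.abs(rng.exponential(scale=1.0, size=1_000_000))
expectation = samples.mean()

for a in [0.5, 1.0, 2.0, 4.0]:
    tail = (samples >= a).mean()      # empirical P(|X| >= a)
    bound = expectation / a           # Markov bound E|X| / a
    assert tail <= bound, (a, tail, bound)
    print(f"a={a}: P(|X|>=a)={tail:.4f} <= E|X|/a={bound:.4f}")
```

The assertion never fails because Markov's inequality applied to the empirical measure is an exact pointwise fact: $a \cdot \#\{i : x_i \ge a\} \le \sum_i x_i$.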


Best Answer

As GWilliams and PhoemueX said, this is the same inequality in different notation (thus, no proof of equivalence is needed), provided that the probabilistic version is corrected to $$ \mathbb{P}(|X|\ge a)\le \frac{\mathbb{E}(|X|)}{a} $$

Of course, the absolute value may be dropped if $X\ge 0$ a.s.
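For completeness, here is the standard one-line proof, which reads identically in either formulation (replace $\mathbb{E}$ and $\mathbb{P}$ by $\int \cdot \, d\mu$ and $\mu$ for the measure-theoretic version):

```latex
a\,\mathbf{1}_{\{|X|\ge a\}} \le |X|
\quad\Longrightarrow\quad
a\,\mathbb{P}(|X|\ge a)
  = \mathbb{E}\!\left(a\,\mathbf{1}_{\{|X|\ge a\}}\right)
  \le \mathbb{E}(|X|).
```

This makes explicit why no separate proof of equivalence is needed: both statements follow from integrating the same pointwise bound.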