A question about a form of Markov inequality


I am studying Lévy processes and encountered the claim that if $X_t$ is a.s. càdlàg, then $X_t$ is continuous in probability.

The proof goes by showing that $$\lim_{u\to t} P(|X_u - X_t|>\epsilon)=\lim_{|t-u|\to 0}P(|X_{|t-u|}|>\epsilon)\le \lim_{h \to 0}\frac{1}{\epsilon} E(|X_h|\wedge \epsilon). $$

The final inequality is new to me: from what I have seen, the bound should involve $E(|X_h|)$. Why does the minimum with $\epsilon$ appear?


BEST ANSWER

Letting $h=t-u$, we have $$P(|X_{|t-u|}|>\epsilon) \le P(|X_{|t-u|}|\wedge \epsilon'>\epsilon) \le \frac{1}{\epsilon} E(|X_h|\wedge \epsilon')$$ whenever $\epsilon' >\epsilon$, where the second step is Markov's inequality applied to the nonnegative random variable $|X_h|\wedge \epsilon'$. Letting $\epsilon' \to \epsilon$ and invoking the dominated convergence theorem, we get $P(|X_{|t-u|}|>\epsilon) \le \frac{1}{\epsilon} E(|X_h|\wedge \epsilon)$.
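As a quick numerical sanity check (not part of the proof), one can verify the truncated Markov bound $P(Y>\epsilon)\le \frac{1}{\epsilon}E(Y\wedge\epsilon)$ by Monte Carlo for a hypothetical choice of distribution for $Y = |X_h|$, say an exponential:

```python
import random
import math

random.seed(0)
eps = 0.5
n = 200_000

# Draw Y from an Exp(1) distribution as a stand-in for |X_h|.
samples = [random.expovariate(1.0) for _ in range(n)]

# Empirical P(Y > eps) and the truncated Markov bound E(Y ∧ eps) / eps.
p_exceed = sum(1 for y in samples if y > eps) / n
bound = sum(min(y, eps) for y in samples) / (n * eps)

print(p_exceed, bound)
assert p_exceed <= bound  # truncated Markov inequality holds
```

For Exp(1) and $\epsilon = 0.5$ the exact values are $P(Y>\epsilon)=e^{-1/2}\approx 0.607$ and $\frac{1}{\epsilon}E(Y\wedge\epsilon)=2(1-e^{-1/2})\approx 0.787$, so the bound is tight enough to be useful as $h\to 0$.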


This is the classical Markov inequality, for $X \ge 0$:

$\int XdP=\int_{X>\epsilon}XdP+\int_{X\leq\epsilon}XdP\geq\epsilon\int_{X>\epsilon}dP=\epsilon P(X>\epsilon)$

Notice that in the lower bound you squeezed $X$ down to $\epsilon$ on the event $\{X>\epsilon\}$, so you can sharpen the inequality by truncating at $\epsilon$ beforehand:

$\int \min \{X,\epsilon \}dP=\int_{X>\epsilon}\epsilon dP+\int_{X\leq\epsilon}XdP\geq\epsilon P(X>\epsilon)$
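A short Monte Carlo check illustrates that the truncated bound $\frac{1}{\epsilon}E(X\wedge\epsilon)$ sits between $P(X>\epsilon)$ and the plain Markov bound $\frac{1}{\epsilon}E(X)$; the Exp(1/2) distribution here is just an arbitrary illustrative choice:

```python
import random

random.seed(1)
eps = 1.0
n = 200_000

# Nonnegative X ~ Exp(rate 1/2), so E(X) = 2.
samples = [random.expovariate(0.5) for _ in range(n)]

p = sum(1 for x in samples if x > eps) / n                 # P(X > eps)
plain = sum(samples) / (n * eps)                           # E(X) / eps
truncated = sum(min(x, eps) for x in samples) / (n * eps)  # E(X ∧ eps) / eps

print(p, truncated, plain)
assert p <= truncated <= plain  # the truncated bound is sharper
```

Here $P(X>1)=e^{-1/2}\approx 0.61$, the truncated bound is $2(1-e^{-1/2})\approx 0.79$, and the plain Markov bound is $2$, so truncating at $\epsilon$ gives a strictly better estimate.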