Minkowski inequality for distances of random variables


I do not understand why Minkowski's inequality guarantees the triangle inequality for the following metric on the set of random variables with finite $p$-th moment:

$d(X,Y)= \left \| X-Y \right \|_p$,

where:

$\left \| X \right \|_p = (E\left | X \right |^p)^\frac{1}{p}$

Minkowski's inequality is:

$\left \| X+Y \right \|_p \leq \left \| X \right \|_p + \left \| Y \right \|_p$

Best answer:

Write $X-Z=(X-Y)+(Y-Z)$ and apply Minkowski's inequality to the random variables $X-Y$ and $Y-Z$:

$$d(X,Z)=||X-Z||_p=||(X-Y)+(Y-Z)||_p \leq ||X-Y||_p+||Y-Z||_p=d(X,Y)+d(Y,Z).$$

Thus $d$ satisfies the triangle inequality: $d(X,Z)\leq d(X,Y)+d(Y,Z).$
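As a quick numerical sanity check (not part of the answer itself), one can estimate $\|X\|_p = (E|X|^p)^{1/p}$ by replacing the expectation with a sample mean and verify the triangle inequality on simulated data. The choice of distributions, sample size, and $p$ below is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

def p_norm(x, p):
    # Empirical version of ||X||_p = (E|X|^p)^(1/p):
    # the expectation is estimated by a sample mean.
    return np.mean(np.abs(x) ** p) ** (1.0 / p)

def d(x, y, p):
    # d(X, Y) = ||X - Y||_p, computed on paired samples.
    return p_norm(x - y, p)

# Samples standing in for the random variables X, Y, Z.
p = 2
n = 10_000
X = rng.normal(size=n)
Y = rng.normal(size=n)
Z = rng.uniform(-1, 1, size=n)

lhs = d(X, Z, p)                  # d(X, Z)
rhs = d(X, Y, p) + d(Y, Z, p)    # d(X, Y) + d(Y, Z)
assert lhs <= rhs, "triangle inequality violated (it should never be)"
print(f"d(X,Z) = {lhs:.4f} <= d(X,Y) + d(Y,Z) = {rhs:.4f}")
```

The assertion holds for any samples, since the empirical $p$-norm is itself a norm on vectors in $\mathbb{R}^n$ and therefore obeys Minkowski's inequality exactly.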