Can someone prove the triangle inequality for the following $d$?


Consider the function $d:\mathbb{R}^2 \times \mathbb{R}^2\rightarrow \left[ 0 , \infty \right)$ defined as follows. For all $x_1, x_2 \in \mathbb{R}^2$, \begin{equation} d(x_1, x_2)=\min_{0\le \alpha_1 ,\alpha_2 \le 1} \left( \left(\alpha_1 +\alpha_2\right)s+\left\| x_1'- x_2' \right\|_p \right), \end{equation} where $s\ge 0$, $p\ge 1$, and, for $i \in \left\{ 1,2 \right\}$, $x_i'=(1-\alpha_i)x_i+\alpha_i x_i^*$ with $x_i^*=\begin{bmatrix} 0& 1 \\ 1 & 0 \end{bmatrix} x_i$ (i.e. $x_i$ with its first and second components swapped).

The idea of $d$ is to define a new (generalized) metric from the $\ell_p$ norm $\left\| \cdot \right\|_p$ that allows "shortcuts" (partial coordinate swaps) at a cost controlled by the penalty parameter $s$.

I tried to prove that $d$ satisfies the triangle inequality, i.e. $d(x_1, x_3)\le d(x_1, x_2)+d(x_2, x_3)$ for all $x_1,x_2,x_3 \in \mathbb{R}^2$, but I have not succeeded.

(Some rephrased identities that I found: $x_1'- x_2'=x_1- x_2+ \alpha_1(x_1^*- x_1)-\alpha_2 (x_2^*- x_2)$,

$x_i^*- x_i= \frac{k_i}{\sqrt{2}} \begin{bmatrix} 1 \\ -1 \end{bmatrix}$ where $k_i= (x_i^*- x_i)^T \ \left(\frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ -1 \end{bmatrix}\right)$ for $i \in \left\{ 1,2 \right\}$

I am not sure whether these identities are helpful for the proof, though.)

I also picked random samples in Python and checked the inequality numerically; in all of my experiments the triangle inequality was satisfied.

Can someone prove the triangle inequality (at least for $p=2$), or suggest a counterexample?

Thank you.

(PS. I came up with this definition of $d$ myself and tried to find literature on the same or similar metrics, but without success. I would appreciate pointers to any related literature, if there is any.)