Let $\Bbb{S}_{++}^n$ be the space of $n\times n$ symmetric positive definite matrices. We define the function $d\colon(\Bbb{R}^n\times\Bbb{S}_{++}^n)\times(\Bbb{R}^n\times\Bbb{S}_{++}^n)\to\Bbb{R}$ as follows $$ d\big((\mathbf{x},A_x),(\mathbf{y},A_y)\big)=\frac{1}{2}(\mathbf{x}-\mathbf{y})^\top\big(A_x+A_y\big)^{-1}(\mathbf{x}-\mathbf{y}), $$ where $\mathbf{x},\mathbf{y}\in\Bbb{R}^n$, and $A_x,A_y\in\Bbb{S}_{++}^n$.
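For concreteness, the definition transcribes directly into NumPy (a minimal sketch; the sample points below are arbitrary):

```python
import numpy as np

def d(x, Ax, y, Ay):
    """Candidate distance: 0.5 * (x-y)^T (Ax+Ay)^{-1} (x-y)."""
    diff = x - y
    return 0.5 * diff @ np.linalg.solve(Ax + Ay, diff)

# Two arbitrary points in R^2 x S++^2
x = np.array([1.0, 2.0]); Ax = np.array([[2.0, 0.5], [0.5, 1.0]])
y = np.array([0.0, -1.0]); Ay = np.array([[1.0, 0.0], [0.0, 3.0]])

print(d(x, Ax, y, Ay))  # non-negative
print(d(y, Ay, x, Ax))  # same value: d is symmetric in its arguments
```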
We would like to prove or disprove that this function is a metric on $\Bbb{R}^n\times\Bbb{S}_{++}^n$. For all $(\mathbf{x},A_x),(\mathbf{y},A_y),(\mathbf{z},A_z)\in\Bbb{R}^n\times\Bbb{S}_{++}^n$, we require the following to hold true:
- $d\big((\mathbf{x},A_x),(\mathbf{y},A_y)\big)\geq0$ (non-negativity, or separation axiom), which holds since $A_x+A_y$ is positive definite and hence so is $(A_x+A_y)^{-1}$ (please correct me if I am wrong),
- $d\big((\mathbf{x},A_x),(\mathbf{y},A_y)\big)=0$ iff $(\mathbf{x},A_x)=(\mathbf{y},A_y)$ (identity of indiscernibles, or coincidence axiom), which I think fails: $\mathbf{x}=\mathbf{y}$ already forces $d=0$, regardless of whether $A_x=A_y$ (what does that actually mean?),
- $d\big((\mathbf{x},A_x),(\mathbf{y},A_y)\big)=d\big((\mathbf{y},A_y),(\mathbf{x},A_x)\big)$ (symmetry), which holds trivially, and
- $d\big((\mathbf{x},A_x),(\mathbf{z},A_z)\big) \leq d\big((\mathbf{x},A_x),(\mathbf{y},A_y)\big) + d\big((\mathbf{y},A_y),(\mathbf{z},A_z)\big)$ (subadditivity / triangle inequality), which is not trivial. It resembles the (squared) Mahalanobis distance, but can we prove (or disprove) that it holds?
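Regarding the last bullet: a quick numerical check (a sketch, with arbitrarily chosen points in $n=1$) suggests the triangle inequality already fails when all the matrices are equal, in which case $d$ reduces to a multiple of the squared Mahalanobis distance, and squared distances are generally not subadditive:

```python
import numpy as np

def d(x, Ax, y, Ay):
    """Candidate distance: 0.5 * (x-y)^T (Ax+Ay)^{-1} (x-y)."""
    diff = x - y
    return 0.5 * diff @ np.linalg.solve(Ax + Ay, diff)

# n = 1, all matrices equal to [[1.0]]: d reduces to (x-y)^2 / 4
A = np.array([[1.0]])
x, y, z = np.array([0.0]), np.array([1.0]), np.array([2.0])

dxz = d(x, A, z, A)    # = 1.0
dxy = d(x, A, y, A)    # = 0.25
dyz = d(y, A, z, A)    # = 0.25
print(dxz, dxy + dyz)  # 1.0 > 0.5: triangle inequality violated
```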
To sum up, what is true about this function? Is it a metric on $\Bbb{R}^n\times\Bbb{S}_{++}^n$? If so, how is the concern about the coincidence axiom resolved, and does the triangle inequality hold? Thank you very much for your help!
EDIT
As can be seen in the accepted answer (as well as in the comments), this function cannot be a metric. Are there any ideas on what would be an appropriate metric (at least having the above properties!), if the $n$-dimensional vectors are the mean vectors of some multivariate Gaussian distributions, and the SPD matrices are the corresponding covariance matrices? I thought of the (symmetrized) Kullback–Leibler divergence, but it seems that it is not a metric either (property 4, the triangle inequality, does not hold, as @tdc shows in his/her answer). Any ideas? Thanks again!
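For reference, the Kullback–Leibler divergence between two multivariate Gaussians has a closed form, and its symmetrized (Jeffreys) version can be sketched as follows (the function names are my own):

```python
import numpy as np

def kl_gauss(mu0, S0, mu1, S1):
    """KL( N(mu0, S0) || N(mu1, S1) ) for multivariate Gaussians."""
    n = mu0.size
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0)
                  + diff @ S1_inv @ diff
                  - n
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def sym_kl(mu0, S0, mu1, S1):
    """Symmetrized (Jeffreys) divergence: KL(P||Q) + KL(Q||P)."""
    return kl_gauss(mu0, S0, mu1, S1) + kl_gauss(mu1, S1, mu0, S0)

mu0 = np.zeros(2); S0 = np.eye(2)
mu1 = np.array([1.0, 0.0]); S1 = np.diag([2.0, 0.5])
print(sym_kl(mu0, S0, mu1, S1))  # symmetric and >= 0, but not a metric
```

This satisfies non-negativity, symmetry, and the coincidence axiom, but (as noted above) not the triangle inequality.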
As was mentioned in the comments:
Take $\mathbf{x}=\mathbf{y}$ with $A_x\neq A_y$: then $d\big((\mathbf{x},A_x),(\mathbf{x},A_y)\big)=0$ even though the two points differ. This violates the axiom of a metric that $d(x,y)=0$ iff $x=y$, so it cannot possibly qualify as a metric in the standard sense.
That is, $d((x,A),(x,B))=0$ for any SPD matrices $A\neq B$.
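This degenerate case is easy to confirm numerically (a minimal sketch with arbitrary sample values):

```python
import numpy as np

def d(x, Ax, y, Ay):
    """Candidate distance: 0.5 * (x-y)^T (Ax+Ay)^{-1} (x-y)."""
    diff = x - y
    return 0.5 * diff @ np.linalg.solve(Ax + Ay, diff)

x = np.array([1.0, -2.0])
A = np.eye(2)
B = 5.0 * np.eye(2)   # A != B, both SPD
print(d(x, A, x, B))  # 0.0: distinct points at zero distance
```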
When this axiom fails but the remaining axioms hold, the function is called a pseudometric.