I have a matrix $\Sigma$ that I use as a covariance matrix for simulation purposes. Its entries are defined as follows: $\Sigma_{ij} = \exp(-d_{ij}/d_0)$, where $d_{ij}=\sqrt{(x_i-x_j)^2+(y_i-y_j)^2}$ is the distance between points of a set $\left\{(x_i,y_i)\right\}_{i\in[[1,n]]}$, and $d_0>0$ is a characteristic distance (the correlation decreases exponentially with the distance between two points). After performing a few simulations, it seems that $\Sigma$ is indeed a covariance matrix for any value of $d_0$ relative to the $d_{ij}$, as long as $d_0\in \mathbb{R}^+$.
I would like, however, to prove it. $\Sigma$ is obviously real and symmetric, so I tried to show that it is also positive semi-definite (perhaps even positive definite?) by looking at its eigenvalues (using Gershgorin's theorem), without success...
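For what it's worth, here is the kind of numerical check I ran, as a minimal Python sketch (the number of points, their coordinates, and the value of $d_0$ are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d0 = 50, 1.0  # arbitrary: number of points and characteristic distance

# Random points (x_i, y_i) in the plane
pts = rng.uniform(0.0, 10.0, size=(n, 2))

# Pairwise Euclidean distances d_ij
diff = pts[:, None, :] - pts[None, :, :]
d = np.sqrt((diff ** 2).sum(axis=-1))

# Sigma_ij = exp(-d_ij / d0)
sigma = np.exp(-d / d0)

# Eigenvalues of a symmetric matrix; PSD means none is negative
eig = np.linalg.eigvalsh(sigma)
print("smallest eigenvalue:", eig.min())
```

In every run I tried, the smallest eigenvalue stays positive (up to floating-point roundoff), which is what suggests the result in the first place, but of course this is no proof.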
Any help/ideas towards that goal, or a counter-example, would be welcome!