Minimal eigenvalue of the Gram matrix generated by the squared-exponential kernel


I have a question about lower-bounding the minimal eigenvalue of the Gram matrix generated by the squared-exponential (SE) kernel.

Given positive integers $n$ and $N$, define $k: \mathbb{R}^n \times \mathbb{R}^n \rightarrow \mathbb{R}$ by $k(x,y) = \exp\left(- {\frac{1}{2}\|x-y\|^2} \right)$.

Define the $N \times N$ matrix $K$, as a function of $x_1, \cdots, x_N \in \mathbb{R}^n$, by

$K = K(x_1, \cdots, x_N) = \begin{bmatrix} k(x_1, x_1) & \cdots & k(x_1, x_N) \\ \vdots & \ddots & \vdots \\ k(x_N, x_1) & \cdots & k(x_N, x_N) \end{bmatrix} $

and denote the minimal eigenvalue of $K$ as $\lambda_{\min}(K)$.

If we assume there exists $d>0$ such that $\|x_i - x_j\| \ge d$ for all $i \neq j$ with $i,j = 1,\cdots, N$, is there a lower bound on $\lambda_{\min}(K)$ as a function of $d$?
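For concreteness, here is a small numerical sketch (my own illustration, not part of any known bound) that builds $K$ for points with pairwise separation at least $d$ and computes $\lambda_{\min}(K)$. The grid construction and the helper name `se_gram` are just assumptions for the example; placing points on a grid of spacing $d$ guarantees $\|x_i - x_j\| \ge d$ for $i \neq j$, and since the SE kernel is strictly positive definite, $\lambda_{\min}(K) > 0$ for distinct points:

```python
import numpy as np

def se_gram(X):
    """Gram matrix with entries K[i, j] = exp(-||x_i - x_j||^2 / 2)."""
    # Pairwise squared distances via broadcasting: shape (N, N).
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * sq)

# Points on a 3x3 grid in R^2 with spacing d, so ||x_i - x_j|| >= d for i != j.
d = 1.0
g = np.arange(3) * d
X = np.array([[a, b] for a in g for b in g])  # N = 9 points in R^2

K = se_gram(X)
# eigvalsh returns eigenvalues of a symmetric matrix in ascending order.
lam_min = np.linalg.eigvalsh(K)[0]
print(lam_min)
```

Experimenting with this for varying $d$ suggests how fast $\lambda_{\min}(K)$ decays as the points cluster, which is the quantity a bound in $d$ would have to control.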

Thank you for your attention.