Proving inequality involving Gaussian function and inverse of gram matrix


Let $y, x_1, \cdots, x_N \in \mathbb{R}$ and $\lambda > 0$ be given. Define $k: \mathbb{R} \times \mathbb{R} \rightarrow \mathbb{R}^+$ as

$k(x,y) = \exp\left[ - \frac{(x-y)^2}{2} \right]$

$\mathbf{k}: \mathbb{R} \rightarrow \mathbb{R}^N$ as:

$x \mapsto \begin{bmatrix} k(x, x_1) & \cdots & k(x, x_N) \end{bmatrix}^T$, and $K, L \in \mathbb{R}^{N \times N}$ as:

$K := \begin{bmatrix} k(x_i , x_j) \end{bmatrix}_{(i,j) = (1,1)}^{(N,N)}$ and $L := \text{diag}(y - x_1, \cdots, y - x_N)$

My conjecture is that a function $h: \mathbb{R} \rightarrow \mathbb{R}$ defined as

$z \mapsto \mathbf{k}^T(y) (K + \lambda I)^{-2} \mathbf{k}(z) \cdot \mathbf{k}^T(y)(K + \lambda I)^{-1} L \mathbf{k}(z)$

is negative for all $z > y$ that satisfy the following equation:

$k(y,z) - \mathbf{k}^T(y) (K + \lambda I)^{-1} \mathbf{k}(z) = 0 \quad (1)$

and positive for all $z < y$ that satisfy equation (1).
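For concreteness, the conjecture can be checked numerically along these lines (a minimal sketch in Python/NumPy; the particular points `xs`, the value `y`, and `lam` are arbitrary illustrative choices, and `bisect` is a hand-rolled root bracketer, not part of the problem statement):

```python
import numpy as np

# Hypothetical configuration: xs, y, and lam are arbitrary choices.
xs = np.array([-1.5, -0.5, 0.0, 0.8, 2.0])  # x_1, ..., x_N
y, lam = 0.3, 0.5                            # y and lambda > 0
N = len(xs)

def k(a, b):
    # Gaussian kernel k(a, b) = exp(-(a - b)^2 / 2)
    return np.exp(-0.5 * (a - b) ** 2)

K = k(xs[:, None], xs[None, :])          # Gram matrix K_ij = k(x_i, x_j)
L = np.diag(y - xs)                      # L = diag(y - x_1, ..., y - x_N)
A = np.linalg.inv(K + lam * np.eye(N))   # (K + lambda I)^{-1}

def g(z):
    # Left-hand side of equation (1).
    return k(y, z) - k(y, xs) @ A @ k(z, xs)

def h(z):
    # h(z) = [k(y)^T (K+lam I)^{-2} k(z)] * [k(y)^T (K+lam I)^{-1} L k(z)]
    return (k(y, xs) @ A @ A @ k(z, xs)) * (k(y, xs) @ A @ L @ k(z, xs))

def bisect(f, a, b, iters=100):
    # Simple bisection; assumes f(a) and f(b) have opposite signs.
    fa = f(a)
    for _ in range(iters):
        m = 0.5 * (a + b)
        if fa * f(m) <= 0:
            b = m
        else:
            a, fa = m, f(m)
    return 0.5 * (a + b)

# Bracket the roots of (1) on a grid around y, then report sign(h(z)) at each root.
grid = np.linspace(y - 10.0, y + 10.0, 4001)
vals = np.array([g(z) for z in grid])
roots = [bisect(g, grid[i], grid[i + 1])
         for i in range(len(grid) - 1) if vals[i] * vals[i + 1] < 0]

for z in roots:
    print(f"z = {z:+.4f} ({'z > y' if z > y else 'z < y'}): h(z) = {h(z):+.6e}")
```

The conjecture predicts that every printed root with $z > y$ has $h(z) < 0$ and every root with $z < y$ has $h(z) > 0$.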

The case $N=1$ is easy, and many simulation results support this conjecture, but I have no idea how to prove it for general $N$.
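For reference, here is one way to spell out the $N=1$ case (a sketch; it uses $k(x_1, x_1) = 1$, so $K = [1]$ and $L = [y - x_1]$). Then

$h(z) = \frac{k(y,x_1)\,k(z,x_1)}{(1+\lambda)^2} \cdot \frac{k(y,x_1)\,(y-x_1)\,k(z,x_1)}{1+\lambda}$,

whose sign is the sign of $y - x_1$, since the kernel values are positive. Taking logarithms in equation (1), i.e. in $(1+\lambda)\,k(y,z) = k(y,x_1)\,k(z,x_1)$, gives

$(y-z)^2 = (y-x_1)^2 + (z-x_1)^2 + 2\ln(1+\lambda)$,

and expanding $(y-z)^2 = (y-x_1)^2 - 2(y-x_1)(z-x_1) + (z-x_1)^2$ yields

$(y-x_1)(z-x_1) = -\ln(1+\lambda) < 0$,

so $x_1$ lies strictly between $y$ and $z$. Hence if $z > y$ then $y - x_1 < 0$ and $h(z) < 0$, while if $z < y$ then $y - x_1 > 0$ and $h(z) > 0$, as conjectured.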

Thank you for your attention.