Inequality involving kernel matrix and quadratic form


Given data points $x_1,x_2,\dots,x_n\in \mathbb{R}^n$ and a kernel $k(\cdot,\cdot)\colon \mathbb{R}^n\times\mathbb{R}^n \to [0,1]$ satisfying the conditions of Mercer's theorem, construct the kernel matrix $A$ with entries $A_{ij} = k(x_i,x_j)$. Also apply the kernel to a new data point $x_{n+1}$ and the previous data points $x_1,x_2,\dots,x_n$, yielding the vector $b$ with entries $b_i = k(x_{n+1},x_i)$. Now consider the quadratic form $$q=b^TA^{-1}b.$$ I want to find an upper bound on $q$. By simulation I've found that $$q\le \frac{b^Tb}{k(x,x)}$$ holds for many data points and kernels, but I couldn't show it formally. Is it possible to prove this inequality? Note that the kernel is assumed to be stationary and monotone, meaning $k(x,y)$ is a decreasing function of the distance between $x$ and $y$: $$k(x,y) = f(\|x-y\|),$$ so in particular $k(x,x)=f(0)$ for every $x$.
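For concreteness, here is a minimal sketch of the kind of simulation described above, assuming a Gaussian RBF kernel $k(x,y)=\exp(-\|x-y\|^2)$ (stationary, monotone, with values in $(0,1]$ and $k(x,x)=1$) and random Gaussian data; the kernel choice and dimensions are illustrative, not from the original post:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example kernel: Gaussian RBF, k(x, y) = f(||x - y||) with f(r) = exp(-r^2)
def k(x, y):
    return np.exp(-np.sum((x - y) ** 2))

n, d = 8, 3
X = rng.normal(size=(n + 1, d))      # data points x_1, ..., x_{n+1}

A = np.array([[k(X[i], X[j]) for j in range(n)] for i in range(n)])
b = np.array([k(X[n], X[i]) for i in range(n)])

q = b @ np.linalg.solve(A, b)        # q = b^T A^{-1} b (solve, rather than forming A^{-1})
bound = (b @ b) / k(X[0], X[0])      # conjectured bound b^T b / k(x, x); here k(x, x) = 1

print(q, bound, q <= bound)
```

Running this for many random draws reproduces the empirical observation that the conjectured bound holds on typical data, though of course that is not a proof.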


Answer:


Since $k(x,y)=f(\|x-y\|)$ is a Mercer kernel, both $A=\big(f(\|x_i-x_j\|)\big)_{1\leq i,j\leq n}$ and $A_1=\big(f(\|x_i-x_j\|)\big)_{1\leq i,j\leq n+1}$ are positive semidefinite, and $A$ is assumed invertible, hence positive definite with $\det A>0$. By the Schur complement factorization $$A_1=\left[\begin{array}{cc}A&b\\b^T&f(0)\end{array}\right]=\left[\begin{array}{cc}I_n&0\\b^TA^{-1}&1\end{array}\right]\left[\begin{array}{cc}A&0\\0&f(0)-q\end{array}\right]\left[\begin{array}{cc}I_n&A^{-1}b\\0&1\end{array}\right],$$ taking determinants gives $\det A_1=\det A\,(f(0)-q)$. Since $\det A_1\geq 0$ and $\det A>0$, it follows that $q\leq f(0)=k(x,x)$.
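As a numerical sanity check (not part of the original answer), the determinant identity and the resulting bound can be verified with an assumed Gaussian RBF kernel $f(r)=e^{-r^2}$, for which $f(0)=1$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed example kernel: f(||x - y||) with f(r) = exp(-r^2), so f(0) = 1
def k(x, y):
    return np.exp(-np.sum((x - y) ** 2))

n, d = 6, 2
X = rng.normal(size=(n + 1, d))

# A_1 is the (n+1) x (n+1) kernel matrix; A is its top-left n x n block,
# and b is the last column of A_1 restricted to the first n rows
A1 = np.array([[k(X[i], X[j]) for j in range(n + 1)] for i in range(n + 1)])
A, b = A1[:n, :n], A1[:n, n]

q = b @ np.linalg.solve(A, b)

# Determinant identity from the Schur complement factorization:
#   det A_1 = det A * (f(0) - q)
lhs = np.linalg.det(A1)
rhs = np.linalg.det(A) * (1.0 - q)

print(np.isclose(lhs, rhs), q <= 1.0)
```

Since the Gaussian kernel matrix on distinct points is positive definite, both determinants come out positive, forcing $1-q>0$, in agreement with the bound $q\leq f(0)$.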