Linear independence of functions from positive definite kernel


Let $S$ be a finite set and $M_p(S)\subset \mathbb{R}^{|S|}$ the set of probability measures on $S$.

Define the positive-definite kernel: $$k:M_p(S)\times M_p(S)\rightarrow \mathbb{R},\,\qquad(\nu,\mu)\mapsto k(\nu,\mu):=\left(\sum_{s\in S}\nu(s)\mu(s)\right)^2=(\nu^T\mu)^2,$$ where, in the final equality, $\nu,\mu$ are regarded as vectors in $\mathbb{R}^{|S|}$.

This positive-definite kernel induces maps $M_p(S)\rightarrow \mathbb{R}$, one for each $\nu\in M_p(S)$:

$$K_\nu:M_p(S)\rightarrow \mathbb{R},\,\qquad \mu\mapsto k(\nu,\mu).$$

Are these maps linearly independent?

Without the square, $$K_\nu=\sum_{s\in S}\nu(s)K_{\delta^s},$$ so the maps are linearly dependent; I am asking whether the square restores linear independence.
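This expansion for the unsquared kernel is easy to check numerically; a minimal sketch (the helper `simplex` and the set size `n = 5` are just illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def simplex(n):
    """A random probability vector on a set S with |S| = n."""
    p = rng.random(n)
    return p / p.sum()

n = 5
nu, mu = simplex(n), simplex(n)

# For the unsquared kernel, K_nu(mu) = nu^T mu, and K_{delta^s}(mu) = mu[s],
# so K_nu = sum_s nu(s) * K_{delta^s} pointwise on M_p(S).
lhs = nu @ mu
rhs = sum(nu[s] * mu[s] for s in range(n))
assert np.isclose(lhs, rhs)
```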


Some thoughts. Consider a finite set $\{\nu_1,\nu_2,\dots,\nu_m\}\subset M_p(S)$ and scalars $(\alpha_i)_{i=1}^m$ such that: $$\sum_{i=1}^m\alpha_iK_{\nu_i}=0.$$ For each $s\in S$, we have: \begin{align*} \sum_{i=1}^m\alpha_i K_{\nu_i}(\delta^s) =0 \Rightarrow \sum_{i=1}^m \alpha_i \nu_i(s)^2 =0. \end{align*} For a pair $s_a\neq s_b\in S$, note that $K_{\nu_i}\left(\frac12\delta^{s_a}+\frac12 \delta^{s_b}\right)=\frac14\left(\nu_i(s_a)+\nu_i(s_b)\right)^2$; expanding the square and using the previous relation to cancel the $\nu_i(s_a)^2$ and $\nu_i(s_b)^2$ terms gives \begin{align*} \sum_{i=1}^m\alpha_i K_{\nu_i}\left(\frac12\delta^{s_a}+\frac12 \delta^{s_b}\right) =0 \Rightarrow \sum_{i=1}^m \alpha_i\nu_i(s_a)\nu_i(s_b) & =0. \end{align*} Summing over $s_a\in S$ and using $\sum_{s_a\in S}\nu_i(s_a)=1$: $$\sum_{i=1}^m\alpha_i \nu_i(s_b)=0.$$ Finally, writing $u_S$ for the uniform distribution, $$\sum_{i=1}^m\alpha_i K_{\nu_i}\left(u_S\right)=0\Rightarrow \sum_{i=1}^m \alpha_i=0.$$
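The kernel evaluations used in this derivation can be sanity-checked numerically; a small sketch (the random $\nu$ and $|S|=4$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
nu = rng.random(n)
nu /= nu.sum()                 # a random probability on S, |S| = n

def K(nu, mu):
    """K_nu(mu) = (nu^T mu)^2."""
    return float(nu @ mu) ** 2

e = np.eye(n)                  # e[s] is the point mass delta^s

# K_nu(delta^s) = nu(s)^2  -- the first displayed implication
assert all(np.isclose(K(nu, e[s]), nu[s] ** 2) for s in range(n))

# K_nu((delta^a + delta^b)/2) = (nu(a) + nu(b))^2 / 4, whose cross term
# isolates nu(a) nu(b) once the squared terms are known to cancel
a, b = 0, 1
assert np.isclose(K(nu, (e[a] + e[b]) / 2), (nu[a] + nu[b]) ** 2 / 4)

# K_nu(u_S) = 1/|S|^2, since nu sums to 1
u = np.full(n, 1 / n)
assert np.isclose(K(nu, u), 1 / n ** 2)
```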


As per the comments, it suffices to show that for any finite subset $\{\nu_1,\dots,\nu_m\}\subset M_p(S)$ of distinct elements, the Gram matrix $G\in M_m(\mathbb{R})$:

$$G_{ij}=\langle\nu_i,\nu_j\rangle^2=(\nu_i^T\nu_j)^2$$

is (strictly) positive definite.


The bounty winner's answer yields, for $\nu_1=(1,0)$, $\nu_2=(0,1)$, $\nu_3=(1/2,1/2)$, and $\nu=(1/3,2/3)$, the linear dependence:

$$K_\nu=-\frac19 K_{\nu_1}+\frac{2}{9}K_{\nu_2}+\frac89 K_{\nu_3},$$

so the maps are not linearly independent in general.
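As a sanity check, these coefficients can be recovered numerically by sampling the four kernel functions on the simplex (a sketch; `lstsq` is used only to solve the exactly-consistent linear system, and the coefficient of $K_{\nu_3}$ comes out to $8/9$):

```python
import numpy as np

# The four distributions on S = {1, 2}
nu1, nu2, nu3, nu = map(np.array, [(1.0, 0.0), (0.0, 1.0), (0.5, 0.5), (1/3, 2/3)])

# Sample K_nu and the K_{nu_i} at many points mu = (t, 1 - t) of the simplex
t = np.linspace(0, 1, 50)
mus = np.stack([t, 1 - t], axis=1)
col = lambda v: (mus @ v) ** 2                      # K_v evaluated at every mu
A = np.stack([col(nu1), col(nu2), col(nu3)], axis=1)
coeffs, *_ = np.linalg.lstsq(A, col(nu), rcond=None)

# The fit is exact: K_nu is a combination of the other three,
# with coefficients (-1/9, 2/9, 8/9)
assert np.allclose(A @ coeffs, col(nu))
assert np.allclose(coeffs, [-1/9, 2/9, 8/9])
```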



On BEST ANSWER

To show that the Gram matrix is positive definite when the $\nu_i$ are linearly independent, it is enough to note that it is the Hadamard product with itself of the Gram matrix corresponding to the usual inner product (which is then strictly positive definite), and to use the Schur product theorem.
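The Schur product theorem step can be illustrated numerically; a sketch with two randomly generated strictly positive definite matrices (the shift by $I$ just keeps them comfortably away from singular):

```python
import numpy as np

rng = np.random.default_rng(2)

# Two strictly positive definite matrices, built as X X^T + I
X, Y = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))
A = X @ X.T + np.eye(4)
B = Y @ Y.T + np.eye(4)

# Schur product theorem: the Hadamard (entrywise) product of positive
# semidefinite matrices is positive semidefinite; with strictly positive
# definite factors, the product is strictly positive definite.
H = A * B
min_eig = np.linalg.eigvalsh(H).min()
assert min_eig > 0
```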

Edit: If you begin with $v_i$ which are not linearly independent, there is no reason why the Gram matrix corresponding to $K$ should be non-singular, as it is the Gram matrix of the tensors $v_i\otimes v_i$. For instance, take $e_1=(1,0),e_2=(0,1)$ and the set $\{e_1,e_2,\frac{1}{2}e_1+\frac{1}{2}e_2,\frac{1}{3}e_1+\frac{2}{3}e_2\}$: these four rank-one tensors lie in the $3$-dimensional space of symmetric $2\times 2$ matrices, so they are necessarily dependent. The Gram matrix for the inner product is $$G=\begin{pmatrix}1&0&\frac{1}{2}&\frac{1}{3}\\ 0&1&\frac{1}{2}&\frac{2}{3}\\ \frac{1}{2}&\frac{1}{2}&\frac{1}{2}&\frac{1}{2}\\ \frac{1}{3}&\frac{2}{3}&\frac{1}{2}&\frac{5}{9} \end{pmatrix}$$ so that $G\circ G$, the Gram matrix for our kernel, is $$\begin{pmatrix}1&0&\frac{1}{4}&\frac{1}{9}\\ 0&1&\frac{1}{4}&\frac{4}{9}\\ \frac{1}{4}&\frac{1}{4}&\frac{1}{4}&\frac{1}{4}\\ \frac{1}{9}&\frac{4}{9}&\frac{1}{4}&\frac{25}{81} \end{pmatrix},$$ which has determinant $0$.
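The two matrices above are easy to reproduce; a short sketch that also confirms the singularity (the rank of $3$ matches the dimension of the space of symmetric $2\times 2$ matrices):

```python
import numpy as np

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
vs = [e1, e2, (e1 + e2) / 2, e1 / 3 + 2 * e2 / 3]

# Gram matrix for the usual inner product, and its Hadamard square,
# which is the Gram matrix for the kernel k(nu, mu) = (nu^T mu)^2
G = np.array([[u @ v for v in vs] for u in vs])
H = G * G

# The rank-one matrices v v^T span at most the 3-dimensional space of
# symmetric 2x2 matrices, so H (the Gram matrix of v_i (x) v_i) is singular.
assert np.isclose(np.linalg.det(H), 0.0)
assert np.linalg.matrix_rank(H) == 3
```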