Suppose I have $N$ Gaussian functions with means $u_i$ and width parameters $\beta_i > 0$, where the pairs $(\beta_i, u_i)$ are distinct. How can one prove that the functions $e^{-\beta_i(x-u_i)^2}$, $1 \le i \le N$, are linearly independent?
Asked on 2026-04-06
How to prove that Gaussian functions are linearly independent?
594 views, asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There are 2 best solutions below
Both Gaussian and multiquadric functions are (strictly) positive definite (Wendland, *Scattered Data Approximation*, pp. 74, 76). If a non-trivial linear combination $\sum_j \alpha_j \Phi(x - x_j)$ were identically zero for all $x$, then it would vanish at each point $x = x_k$ in particular, and summing those evaluations against $\bar{\alpha}_k$ would give a vanishing quadratic form $\sum_j\sum_k \alpha_j \bar{\alpha}_k \Phi(x_j - x_k) = 0$ with a nonzero coefficient vector, contradicting the strict positive definiteness of $\Phi$. (Note that this argument treats translates of a single kernel, i.e. the case where all the $\beta_i$ are equal and the centers $x_j$ are distinct.)
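The positive-definiteness fact this argument rests on is easy to check numerically. Below is a minimal sketch (the point set and kernel width are illustrative choices, not from the answer): the Gram matrix $\bigl(\Phi(x_j - x_k)\bigr)_{j,k}$ of the Gaussian kernel at distinct points has strictly positive eigenvalues, so the quadratic form above cannot vanish on a nonzero coefficient vector.

```python
import numpy as np

def gaussian_gram(points, beta=1.0):
    """Gram matrix Phi(x_j - x_k) for the Gaussian kernel Phi(r) = exp(-beta r^2)."""
    pts = np.asarray(points, dtype=float)
    diff = pts[:, None] - pts[None, :]
    return np.exp(-beta * diff**2)

points = [0.0, 0.7, 1.3, 2.9, 4.2]   # distinct centers x_j (illustrative)
G = gaussian_gram(points)

# The matrix is symmetric, so eigvalsh applies; strict positive definiteness
# means every eigenvalue is strictly positive.
eigvals = np.linalg.eigvalsh(G)
print(eigvals.min() > 0)  # True: the quadratic form is positive on nonzero vectors
```

The same check fails for kernels that are only positive semi-definite, which is why the *strict* positive definiteness in Wendland's book is the operative hypothesis.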
I expect that the linear independence can be proved via asymptotic arguments along the following lines. Suppose that there exist constants $\lambda_1,\dots,\lambda_N$, not all zero, such that $$f(x)=\sum_{i=1}^N \lambda_i e^{-\beta_i(x-u_i)^2}=0.$$ Let $\beta =\min\{\beta_1,\dots,\beta_N\}$ and $$g(x)=\sum_{\beta_i=\beta} \lambda_i e^{-\beta(x-u_i)^2},$$ where the centers $u_i$ appearing in this sum are distinct because the pairs $(\beta_i,u_i)$ are.
Considering the asymptotics of $e^{\beta x^2}f(x)=e^{\beta x^2}\bigl(g(x)+o(g(x))\bigr)$ as $x\to\infty$ (the terms with $\beta_i>\beta$ are killed by the factor $e^{(\beta-\beta_i)x^2}$), we see that $e^{\beta x^2}g(x)\to 0$. Next, $$e^{\beta x^2}g(x)=\sum_{\beta_i=\beta} \lambda_i e^{-\beta u_i^2} e^{2\beta u_i x}.$$ Since the distinct $u_i$ give exponentials with distinct growth rates, considering again the asymptotics as $x\to\infty$ (compare the dominant term, i.e. the one with the largest $u_i$) forces every coefficient $\lambda_i e^{-\beta u_i^2}$ to be zero, hence $\lambda_i=0$ for all $i$ with $\beta_i=\beta$; the argument then repeats with the remaining terms.
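The conclusion of this sketch can be sanity-checked numerically: sampling the $N$ Gaussians at many points gives a matrix of full column rank $N$, so the only combination $\sum_i \lambda_i e^{-\beta_i(x-u_i)^2}$ that vanishes everywhere is the trivial one. The parameter values below are illustrative and include repeated $\beta_i$ with distinct $u_i$, the case the minimum-$\beta$ step handles.

```python
import numpy as np

betas = np.array([0.5, 0.5, 1.0, 2.0])    # beta_i (repeats allowed)
mus   = np.array([0.0, 1.0, 0.0, -1.0])   # u_i; the pairs (beta_i, u_i) are distinct
xs = np.linspace(-5.0, 5.0, 200)

# Column i holds samples of exp(-beta_i (x - u_i)^2).
A = np.exp(-betas[None, :] * (xs[:, None] - mus[None, :])**2)

# Full column rank means no nonzero lambda solves A @ lambda = 0,
# i.e. the sampled Gaussians are linearly independent.
print(np.linalg.matrix_rank(A))
```

Full rank on a fine enough sample is of course only evidence, not a proof; the asymptotic argument above is what rules out cancellation at *all* real $x$.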
PS. I expect that the linear independence of the family of functions $\{e^{2\beta u_i x}\}\equiv\{e^{\mu_1 x}, e^{\mu_2 x},\dots, e^{\mu_k x}\}$ (writing $\mu_j$ for the distinct exponents $2\beta u_j$) can also be proved using the Vandermonde determinant in the variables $z_j = e^{\mu_j}$, obtained by evaluating at $x=0,1,\dots,k-1$:
$$\left|\begin{array}{cccc} e^{\mu_1 \cdot 0} & e^{\mu_2 \cdot 0} & \dots & e^{\mu_k \cdot 0}\\ e^{\mu_1 \cdot 1} & e^{\mu_2 \cdot 1} & \dots & e^{\mu_k \cdot 1}\\ \vdots & \vdots & & \vdots \\ e^{\mu_1 (k-1)} & e^{\mu_2 (k-1)} & \dots & e^{\mu_k (k-1)} \end{array} \right| =\prod_{1\le i<j\le k} \bigl(e^{\mu_j}- e^{\mu_i}\bigr)\ne 0.$$
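The Vandermonde identity invoked here is easy to verify numerically. A minimal sketch (the exponents $\mu_j$ below are illustrative, not from the answer): build the matrix with entries $e^{\mu_j \cdot i}$ for rows $i = 0,\dots,k-1$ and compare its determinant with the product $\prod_{i<j}(e^{\mu_j}-e^{\mu_i})$, which is nonzero whenever the $\mu_j$ are distinct.

```python
import numpy as np

mus = np.array([0.3, -1.1, 0.9])   # distinct exponents mu_j (illustrative)
z = np.exp(mus)                    # the determinant is Vandermonde in z_j = e^{mu_j}
k = len(z)

# V[i, j] = z_j**i = e^{mu_j * i}, rows indexed by i = 0..k-1.
V = z[None, :] ** np.arange(k)[:, None]

det = np.linalg.det(V)
prod = np.prod([z[j] - z[i] for i in range(k) for j in range(i + 1, k)])
print(np.isclose(det, prod))  # True: det equals the Vandermonde product, which is nonzero
```

Since $t \mapsto e^t$ is injective, distinct $\mu_j$ give distinct $z_j$, so every factor of the product is nonzero and the $k$ sample vectors are linearly independent.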