Finite peaks of a function


Let $N\in\mathbb{N}$, $c_j\in\mathbb{R}$, $x_j\in\mathbb{R}^d$, $j\in\mathbb{N}_N$ be given. Let $K:\mathbb{R}^d\times\mathbb{R}^d\to\mathbb{R}$ be a bivariate function. Consider the function $f:\mathbb{R}^d\to\mathbb{R}$ defined by
$$ f(x):=\sum_{j=1}^N c_j K(x,x_j), \quad x\in\mathbb{R}^d. $$ I am interested in the case where $K$ is the Gaussian kernel $$ K(x,y)=\exp\Big(-\frac{\|x-y\|_2^2}{2\sigma^2}\Big), $$ or the Laplacian kernel $$ K(x,y)=\exp\Big(-\frac{\|x-y\|_2}{\sigma}\Big). $$ So my question is whether $f$ has finitely many peaks (local maxima). If so, how can this be proved rigorously, and is the number of peaks at most $N$?

For the Gaussian case, my idea is to take the derivative and count its zeros: peaks can only occur where the first derivative equals $0$. But how can one determine, or bound, the number of zeros of the derivative?
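As a quick numerical experiment before any proof attempt, one can evaluate $f$ on a fine one-dimensional grid and count the strict local maxima (a sketch with made-up centers and coefficients; with well-separated centers each one carries its own peak):

```python
import numpy as np

def f(x, centers, coeffs, sigma=1.0):
    """f(x) = sum_j c_j exp(-|x - x_j|^2 / (2 sigma^2)) evaluated on a 1-D grid."""
    diff = x[:, None] - centers[None, :]                 # shape (M, N)
    return (coeffs * np.exp(-diff**2 / (2 * sigma**2))).sum(axis=1)

# Hypothetical example: N = 3 well-separated centers, positive coefficients
centers = np.array([-2.0, 0.0, 2.5])
coeffs = np.array([1.0, 0.8, 1.2])
x = np.linspace(-6.0, 6.0, 10001)
y = f(x, centers, coeffs, sigma=0.5)

# Count strict interior local maxima on the grid
peaks = np.where((y[1:-1] > y[:-2]) & (y[1:-1] > y[2:]))[0] + 1
print(len(peaks), x[peaks])  # here: 3 peaks, each near one of the centers
```

In 1-D with $c_j>0$ and centers far apart relative to $\sigma$, the peak count matches $N$; for two equal-weight Gaussians, the two peaks merge into one once the separation drops to $2\sigma$, so the count can also fall below $N$.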

1 Answer

Yes, solving for where the gradient (the derivative is now vector-valued) vanishes finds the critical points, and checking which of those are maxima finishes the job. Since every term decays to $0$ at infinity, all peaks lie in a bounded region, and since $f$ is real-analytic one expects finitely many of them. Finding how many there are, and where, seems difficult analytically; it takes a fair amount of vector calculus just to rearrange this into a root-finding problem:

\begin{align}
\frac{\partial}{\partial x} f(x)
&= \frac{\partial}{\partial x} \sum_{j=1}^N c_j K(x,x_j) \\
&= \sum_{j=1}^N c_j \frac{\partial}{\partial x} K(x,x_j) \\
&= \sum_{j=1}^N c_j \frac{\partial}{\partial x} \exp \Big(-\frac{1}{2\sigma^2} \| x - x_j \|_2^2 \Big) \\
&= \sum_{j=1}^N c_j \Bigg( -\frac{1}{2\sigma^2} \frac{\partial}{\partial x} \| x - x_j \|_2^2 \Bigg) \exp \Big(-\frac{1}{2\sigma^2} \| x - x_j \|_2^2 \Big) \\
&= \sum_{j=1}^N c_j \Bigg( -\frac{1}{2\sigma^2} \, 2 \| x - x_j \|_2 \frac{\partial}{\partial x} \| x - x_j \|_2 \Bigg) \exp \Big(-\frac{1}{2\sigma^2} \| x - x_j \|_2^2 \Big) \\
&= \sum_{j=1}^N c_j \Bigg( -\frac{1}{\sigma^2} \| x - x_j \|_2 \frac{x - x_j}{\| x - x_j \|_2} \Bigg) \exp \Big(-\frac{1}{2\sigma^2} \| x - x_j \|_2^2 \Big) \\
&= \sum_{j=1}^N c_j \Bigg( -\frac{x - x_j}{\sigma^2} \Bigg) \exp \Big(-\frac{1}{2\sigma^2} \| x - x_j \|_2^2 \Big) \\
&= \sum_{j=1}^N c_j \frac{x_j - x}{\sigma^2} \exp \Big(-\frac{1}{2\sigma^2} \| x - x_j \|_2^2 \Big)
\end{align}
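As a sanity check on this closed form, one can compare it against a central finite-difference approximation at a random point (a small numerical sketch; the helper names `f_val` and `grad_f` are mine):

```python
import numpy as np

def f_val(x, centers, coeffs, sigma=1.0):
    """f(x) = sum_j c_j exp(-||x - x_j||^2 / (2 sigma^2))."""
    sq = ((x - centers)**2).sum(axis=1)
    return (coeffs * np.exp(-sq / (2 * sigma**2))).sum()

def grad_f(x, centers, coeffs, sigma=1.0):
    """Closed form derived above: sum_j (c_j/sigma^2)(x_j - x) exp(-||x - x_j||^2/(2 sigma^2))."""
    diff = centers - x                                    # rows x_j - x, shape (N, d)
    w = coeffs * np.exp(-(diff**2).sum(axis=1) / (2 * sigma**2))
    return (w[:, None] * diff).sum(axis=0) / sigma**2

rng = np.random.default_rng(0)
d, N = 3, 5
centers, coeffs = rng.normal(size=(N, d)), rng.normal(size=N)
x0 = rng.normal(size=d)

# Central finite differences in each coordinate
eps = 1e-6
num = np.array([(f_val(x0 + eps * np.eye(d)[i], centers, coeffs)
               - f_val(x0 - eps * np.eye(d)[i], centers, coeffs)) / (2 * eps)
                for i in range(d)])
print(np.abs(num - grad_f(x0, centers, coeffs)).max())  # tiny: finite-difference error only
```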

Finding the zeros of this gradient (and then confirming which are maxima) does not seem easy analytically. Numerically, though, once $\sigma$, the $c_j$, and the $x_j$ are supplied, standard root-finding algorithms can locate them. The vector derivative used above came from the Matrix Cookbook (snippet below):

[Image: snippet of the relevant vector-derivative identity from the Matrix Cookbook.]
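Numerically, setting the last line above to zero and solving for $x$ as a weighted average gives the fixed-point update $x \leftarrow \frac{\sum_j c_j w_j(x)\, x_j}{\sum_j c_j w_j(x)}$ with $w_j(x) = \exp(-\|x-x_j\|_2^2/(2\sigma^2))$; when all $c_j > 0$ this is the mean-shift iteration, which one can start from each center (a sketch under that positivity assumption, with made-up data):

```python
import numpy as np

def mean_shift_mode(x, centers, coeffs, sigma=1.0, tol=1e-10, max_iter=500):
    """Fixed-point iteration obtained from setting the gradient to zero:
    x <- sum_j c_j w_j(x) x_j / sum_j c_j w_j(x),
    where w_j(x) = exp(-||x - x_j||^2 / (2 sigma^2)).
    Assumes all c_j > 0 (the mean-shift iteration)."""
    for _ in range(max_iter):
        w = coeffs * np.exp(-((x - centers)**2).sum(axis=1) / (2 * sigma**2))
        x_new = (w[:, None] * centers).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Start one run from each center; well-separated centers each keep a mode.
centers = np.array([[-2.0, 0.0], [0.0, 0.0], [2.5, 1.0]])
coeffs = np.array([1.0, 0.8, 1.2])
modes = np.array([mean_shift_mode(c, centers, coeffs, sigma=0.5) for c in centers])
print(np.round(modes, 4))  # three distinct modes, each close to a center
```

With mixed-sign $c_j$ the denominator can vanish and the iteration need not converge; in that case a general-purpose root finder applied to the gradient (e.g. Newton's method with the analytic Hessian) is a safer choice.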