This is a problem I am trying to solve in order to find objects in images. A given image is scanned and the coordinates of detected features are returned. With these coordinates I want to assign a value to each pixel, based on how "close" it is to each feature, through a normal distribution. So for a particular pixel $(x,y)$ this would be the value generated:
$ \huge P(x,y) = \frac{1}{2\pi \sigma_{x}\sigma_{y}}\sum\limits_{i = 1}^{\text{detfeatures}} e^{-\frac{1}{2}\left(\left(\frac{x-\mu_{xi}}{\sigma_{x}}\right)^{2}+\left(\frac{y-\mu_{yi}}{\sigma_{y}}\right)^{2}\right)}$
Here $\text{detfeatures}$ is simply the number of features detected, and the coordinates of each feature are represented by $\mu$, with the subscripts indicating whether we are looking at the x or y coordinate and of which feature; e.g. the x coordinate of feature 3 would be given by $\mu_{x3}$. The $\sigma$ denotes the standard deviation, and all of this information will be computed at run time.
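For concreteness, the sum above can be evaluated over the whole pixel grid with NumPy. This is only a sketch of my setup; the names `likelihood_surface` and `features` are mine, not part of any library:

```python
import numpy as np

def likelihood_surface(features, width, height, sigma_x, sigma_y):
    """Evaluate the sum-of-Gaussians P(x, y) at every pixel.

    features: iterable of (mu_x, mu_y) pairs, one per detected feature.
    Returns an array of shape (height, width), indexed as P[y, x].
    """
    X, Y = np.meshgrid(np.arange(width), np.arange(height))
    P = np.zeros((height, width))
    for mu_x, mu_y in features:
        # exponent of one axis-aligned Gaussian centred on the feature
        P += np.exp(-0.5 * (((X - mu_x) / sigma_x) ** 2
                            + ((Y - mu_y) / sigma_y) ** 2))
    return P / (2.0 * np.pi * sigma_x * sigma_y)
```

With a single feature at $(5, 5)$ and $\sigma_x = \sigma_y = 1$, the surface peaks at that pixel with value $\frac{1}{2\pi}$.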
This will generate a 3D surface (or a scalar field, if you will) which represents the likelihood that an object is detected at each point, and the problem is this:
Is there any way to find the local maxima of the function directly from this formula, rather than setting a detection threshold and increasing it, i.e. checking which points lie above a certain z value? Solutions involving summations are welcome.
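For comparison, here is the numerical approach I could fall back on (my own sketch, not part of the question's method): since $P$ is a sum of exponentials, setting $\partial P/\partial x = \partial P/\partial y = 0$ gives transcendental equations with no closed-form solution in general, so on a sampled grid the maxima can be located by comparing each pixel to its neighbourhood with `scipy.ndimage.maximum_filter`:

```python
import numpy as np
from scipy.ndimage import maximum_filter

def local_maxima(P, size=3):
    """Return (row, col) indices of pixels equal to the maximum of
    their size x size neighbourhood, i.e. the grid's local maxima.
    Note: a plateau of constant value is reported in full."""
    return np.argwhere(P == maximum_filter(P, size=size))
```

For a surface like the example below, with two well-separated features, this returns exactly the two peak pixels.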
I am sorry if this is a simple question as I have not yet taken multivariate calculus.
Also, here is an example of a possible generated surface with 2 local maxima I am interested in: ![Example of a generated surface with two local maxima](https://i.stack.imgur.com/ovemv.png)