I am developing a mobile game app that needs to find a point on a map based on a set of observed values. The app lets users touch the point on a map closest to where they think an event happened. The algorithm needs to determine the most likely point where the event could have happened based on those observations. Observations arrive for x minutes, then stop for x minutes, and this cycle repeats. Each cycle delivers between 20 and 40 observations.
I need to display a single estimated point that updates as the observations come in.
If you assume that each user's guess is unbiased (i.e., users are not consistently off by some amount), then you will want to set the most likely location to one of two values: the mean of the observed points (which minimizes the sum of squared distances to the guesses) or the component-wise median (which is far more robust to a few wild guesses).
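A minimal sketch, assuming the two candidate estimates are the mean and the component-wise median of the observed (x, y) touch coordinates (the data here is made up for illustration):

```python
from statistics import mean, median

# Hypothetical batch of user touches as (x, y) map coordinates.
observations = [(10.0, 20.0), (12.0, 22.0), (11.0, 19.0), (50.0, 90.0)]

xs, ys = zip(*observations)

# Centroid (mean point): minimizes total squared distance to all guesses.
centroid = (mean(xs), mean(ys))

# Component-wise median: much less sensitive to outlier guesses,
# such as the (50.0, 90.0) touch above.
median_point = (median(xs), median(ys))
```

With the outlier in the sample data, the centroid is pulled noticeably toward it while the median point stays near the cluster, which is the usual reason to prefer the median when some users tap carelessly.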
To update, just keep re-calculating with more and more data as it comes in.
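For the mean, you do not even need to store the observations to re-calculate: a running (incremental) mean can be updated in O(1) per touch. A sketch, with hypothetical class and method names:

```python
class RunningCentroid:
    """Keeps a running mean of (x, y) guesses so the displayed
    point can be refreshed after every new observation without
    storing the full history."""

    def __init__(self):
        self.n = 0
        self.x = 0.0
        self.y = 0.0

    def add(self, x, y):
        # Standard incremental-mean update: mean += (value - mean) / n
        self.n += 1
        self.x += (x - self.x) / self.n
        self.y += (y - self.y) / self.n

    def point(self):
        # Current best estimate of the event location.
        return (self.x, self.y)
```

The median has no equally cheap exact update, but with only 20 to 40 observations per cycle, simply re-sorting the stored points each time a guess arrives is negligible work.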