Multi-sensor fusion with full data sets


I have noisy position tracks of a moving object from multiple sensors, and I want to fuse these tracks to come up with a single "best" estimate of the object's position. Unlike in most applications, I don't need to fuse the tracks in real-time; instead I'm fusing the data after the entire track has been collected from each sensor.

I do have covariances for the tracks from each sensor, and a simple multi-sensor Kalman filter has yielded decent results, i.e. the error between the fused track and the true trajectory is smaller than the error between any individual sensor's track and the true trajectory. However, since I have all the data at hand and can therefore look into the future as I compute each state estimate, I think I should be able to come up with a result that further reduces the error. Are there any other algorithms I could try that leverage the fact that I have the full data set?
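For reference, here is roughly what my current fusion step looks like. This is a minimal sketch, not my actual code: the function name `kalman_fuse` and the sequential per-sensor update structure are just one way to set it up.

```python
import numpy as np

def kalman_fuse(z_list, R_list, F, Q, H, x0, P0):
    """Fuse measurements from several sensors with one Kalman filter:
    predict once per time step, then apply one measurement update per
    sensor that reported at that step (sequential updates)."""
    x, P = x0.copy(), P0.copy()
    I = np.eye(len(x0))
    track = []
    for zs, Rs in zip(z_list, R_list):  # one (measurements, covariances) pair per step
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Sequential update: fold in each sensor's measurement in turn
        for z, R in zip(zs, Rs):
            y = z - H @ x                      # innovation
            S = H @ P @ H.T + R                # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
            x = x + K @ y
            P = (I - K @ H) @ P
        track.append(x.copy())
    return np.array(track)
```

As a toy check, with a 1D constant-velocity model (state = position, velocity) and two position sensors, the fused track converges onto the true trajectory.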

Some additional info:

  • The samples from each sensor are not synchronized in time. Right now I'm using a simple linear interpolation to sync the times.
  • Most of the tracks overlap in time, but some may not. For example, sensor 1 may have a track from time 1 to 5 seconds, and sensor 2 may have a track from 6 to 10 seconds.
  • I'm currently using a 6-state Kalman filter (3D position and velocity in Cartesian coordinates), but the object can accelerate in any direction, which a constant-velocity model doesn't capture.

Many thanks in advance!