So, I come from a programming background, but not a mathematical one. I have a problem that I'm sure has a great mathematical solution!
So, I have a GPS data point for every second of a car's drive. Due to accuracy limits, if I capture the same drive with different devices, the points come out slightly different; each device adds a bit of noise.
I use this data to work out the speed between points, and from that the acceleration/braking. The problem is that if the points differ slightly between devices, the events detected can be hugely different, and I need consistency between devices.
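For concreteness, here is a minimal sketch of the pipeline described above — speed from consecutive fixes via the haversine distance, then acceleration by differencing the speeds. The coordinates are made-up example values, and the 1 Hz sample rate is taken from the question:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (degrees)."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# Hypothetical 1 Hz fixes: one (lat, lon) pair per second
fixes = [(52.0000, 0.0000), (52.0001, 0.0000), (52.0003, 0.0000)]

# dt = 1 s, so distance in metres is numerically equal to speed in m/s
speeds = [haversine_m(*fixes[i], *fixes[i + 1]) for i in range(len(fixes) - 1)]
accels = [speeds[i + 1] - speeds[i] for i in range(len(speeds) - 1)]  # m/s^2
```

Because each speed is a difference of two noisy positions, and each acceleration a difference of two noisy speeds, position noise gets amplified twice — which is exactly why two devices can report very different acceleration events.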
Is there any kind of smoothing anyone can suggest that could help me with this?
Many thanks
Depends on the amount of processing time you have.
Orocos BFL is a C++ library that I found really helpful; it has a nice architecture and is used in embedded applications.
IMHO, stay away from Bayes++; in my application it outperformed BFL in speed, but it has one of the nastiest software architectures I have ever seen, where every variable and function is called something like forf().
MATLAB also has its own Kalman filter libraries.
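To make the Kalman filter suggestion concrete, here is a minimal sketch (not BFL or MATLAB code — a plain NumPy illustration) of a constant-velocity Kalman filter over noisy 1-D position fixes. The process-noise intensity `q` and measurement variance `r` are guesses you would tune to your GPS receiver:

```python
import numpy as np

def kalman_cv(zs, dt=1.0, q=0.5, r=9.0):
    """Constant-velocity Kalman filter over noisy 1-D position measurements zs.
    q: process-noise intensity, r: measurement variance (both assumed, tune them)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition: [position, velocity]
    H = np.array([[1.0, 0.0]])             # we observe position only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])    # discretised white-acceleration noise
    R = np.array([[r]])
    x = np.array([[zs[0]], [0.0]])         # start at the first fix, zero speed
    P = np.eye(2) * 100.0                  # wide initial uncertainty
    out = []
    for z in zs:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the new measurement
        y = np.array([[z]]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        out.append((x[0, 0], x[1, 0]))     # smoothed (position, velocity)
    return out
```

A nice side effect for this use case: the filter estimates velocity directly as part of the state, so you never have to difference raw positions to get speed.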
I have seen some papers that use online particle filters, but they must have gone to a lot of trouble to multithread the algorithm, and they must have some really badass hardware.
Also, you could use DGPS, but given that you are finding acceleration from position, i.e. numerically taking a second derivative, I don't expect that this will be enough.
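The point about the second derivative can be shown numerically: for a stationary car (true acceleration zero), the second central difference of i.i.d. position noise with standard deviation sigma has standard deviation sigma·sqrt(6). The noise level of 3 m here is an assumed figure for illustration:

```python
import random
import statistics

random.seed(1)
sigma = 3.0  # assumed per-fix GPS noise, metres
noisy = [random.gauss(0.0, sigma) for _ in range(10000)]  # stationary car: true position 0

# second central difference approximates acceleration (dt = 1 s)
acc = [noisy[i - 1] - 2 * noisy[i] + noisy[i + 1] for i in range(1, len(noisy) - 1)]
spread = statistics.stdev(acc)  # ≈ sigma * sqrt(6) ≈ 7.3 m/s^2 of pure noise
```

So a few metres of fix noise turns into spurious "accelerations" of several m/s², which is why better raw fixes (DGPS) alone may not fix the event detection.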
EDIT: As I understand from the comments, this is offline processing. I would suggest particle filters in MATLAB. Check this out
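Since offline processing lifts the real-time constraint, a particle filter becomes practical. As a language-agnostic illustration (the answer suggests MATLAB; this sketch uses NumPy, with the same assumed constant-velocity model and noise settings as above), a minimal bootstrap particle filter might look like:

```python
import numpy as np

def bootstrap_pf(zs, n=500, dt=1.0, q=0.5, r=9.0, seed=0):
    """Minimal bootstrap particle filter: constant-velocity model,
    1-D position observations zs with assumed measurement variance r."""
    rng = np.random.default_rng(seed)
    pos = zs[0] + rng.normal(0.0, 3.0, n)   # particle positions near the first fix
    vel = rng.normal(0.0, 5.0, n)           # broad prior over velocity
    out = []
    for z in zs:
        # propagate each particle through the motion model with process noise
        vel = vel + rng.normal(0.0, np.sqrt(q * dt), n)
        pos = pos + vel * dt + rng.normal(0.0, 0.5, n)
        # weight particles by Gaussian measurement likelihood
        w = np.exp(-0.5 * (z - pos) ** 2 / r)
        w /= w.sum()
        # resample (plain multinomial, for brevity)
        idx = rng.choice(n, size=n, p=w)
        pos, vel = pos[idx], vel[idx]
        out.append((pos.mean(), vel.mean()))  # posterior-mean position and velocity
    return out
```

With only a few hundred particles and a 2-D state this runs comfortably offline on ordinary hardware; the multithreading heroics mentioned above are only needed for high-rate online use.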