I’m trying to implement a Time Difference of Arrival (TDOA) multilateration algorithm. However, I’m struggling to understand the concept at the level necessary to produce an actual implementation.
For reference, my problem is as follows:
Given the locations of three receivers, and the time at which each receiver “saw” some signal, determine the location of the source. We can assume the source and the receivers lie in a plane, and that the curvature of the earth is negligible. We also know the propagation speed of the signal.
My understanding of the subject is that this somehow involves fitting hyperbolas and determining their intersection (see Measurement Geometry and Solution Algorithms on Wikipedia).
In practice, how would this work? What equation defines each hyperbola? How would I perform the fitting (would least squares regression work here)? And overall, why does this work?
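For concreteness, here is the kind of approach I’m imagining, on synthetic data. I’m guessing at the geometry here: each pair of receivers gives one TDOA equation of the form ||p − r_i|| − ||p − r_0|| = v·(t_i − t_0), and I try to solve the resulting system numerically with `scipy.optimize.least_squares`. The receiver positions, speed, and initial guess below are all made up for illustration — I have no idea if this is the standard formulation.

```python
import numpy as np
from scipy.optimize import least_squares

v = 343.0  # assumed signal speed (e.g. sound in air, m/s)

# Arbitrary receiver positions in the plane (made up for this sketch)
receivers = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])

# Synthetic "truth": pick a source, compute the arrival time at each receiver
source_true = np.array([30.0, 70.0])
t = np.linalg.norm(receivers - source_true, axis=1) / v

def residuals(p):
    # Each TDOA pair (i, 0) should constrain the source to one hyperbola branch:
    #   ||p - r_i|| - ||p - r_0|| = v * (t_i - t_0)
    d = np.linalg.norm(receivers - p, axis=1)
    return [(d[i] - d[0]) - v * (t[i] - t[0]) for i in (1, 2)]

sol = least_squares(residuals, x0=np.array([50.0, 50.0]))
print(sol.x)
```

On this toy data the solver does recover the source I planted, but I don’t know whether this is robust in general (e.g. with noisy timestamps, or when the initial guess is far off), which is part of what I’m asking.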