How can satellites measure distances so accurately?


Not sure if this is the right forum, but I know this can be explained with some math that I haven't been able to find. Thanks.

With the formula $d = ct$ we can get the distance between two points, where $t$ is the time it takes a signal to travel from point A to point B and $c$ is the speed of light.

Let's work in milliseconds and meters per second (the speed of light is 299 792 458 m/s). With a number that large, a slight delay in a satellite's response time translates into a huge distance error. If a satellite were even 1 ms off with its response, that would be almost 300 kilometers (299.792458 km)! So what I'm wondering is: how can satellites register the ping-pong action and actually send out a response fast enough to be accurate? Thanks.
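
To put numbers on that, here is a minimal Python sketch of the same arithmetic (the function name is just for illustration): a millisecond of clock error costs almost 300 km, so metre-level positioning demands nanosecond-level timing.

```python
# Speed of light in m/s, as quoted above.
C = 299_792_458.0

def distance_error(timing_error_s: float) -> float:
    """Distance light travels during a timing error, via d = c * t."""
    return C * timing_error_s

print(distance_error(1e-3))  # 1 ms -> 299792.458 m, almost 300 km
print(distance_error(3e-9))  # 3 ns -> ~0.9 m, metre-level accuracy
```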


BEST ANSWER

The main point is that you can calculate the travel time and the response delays and allow for them. The satellites' orbits are well known; in fact, one purpose of the signals they exchange is to measure those orbits. The delays through the satellite are also known, having been measured over temperature as part of ground testing. Finally, if the delay in one satellite drifts, you have redundancy in all the links between the satellites: the error analysis makes it clear which satellite is the bad actor and what new delay value should be used to account for it.
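
As a sketch of that correction step (hypothetical names and numbers, not a real satellite's calibration data): subtract the satellite's calibrated internal delay from the measured round trip, then convert what remains to a one-way distance.

```python
# Hypothetical illustration of correcting for a known response delay.
C = 299_792_458.0  # speed of light in m/s

def one_way_range(round_trip_s: float, transponder_delay_s: float) -> float:
    """One-way distance from a two-way measurement, after removing the
    satellite's known (ground-calibrated) processing delay."""
    flight_time = round_trip_s - transponder_delay_s
    return C * flight_time / 2.0

# 135 ms round trip with a 1.2 ms calibrated transponder delay:
print(one_way_range(0.135, 0.0012))  # ~20 056 km, a GPS-like range
```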

ANSWER

All the satellites are synchronized with each other, while the receiver is not; its clock shift, however, remains constant during the measurement.

The satellites emit messages tagged with the transmission time, so the receiver can compute an apparent "time of flight", which includes this shift.
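
A minimal sketch of that measurement, with invented numbers: because the receiver timestamps reception with its own biased clock, the computed range (a "pseudorange") is off by $c$ times the shift, identically for every satellite.

```python
# Invented numbers; illustrates why the measured range overshoots.
C = 299_792_458.0  # speed of light in m/s

def pseudorange(t_transmit: float, t_receive_local: float) -> float:
    """Apparent range from the tagged transmit time and the receiver's
    own (biased) reception time; includes c * clock_shift."""
    return C * (t_receive_local - t_transmit)

true_flight, shift = 0.067, 0.0005  # 67 ms flight, +0.5 ms receiver shift
print(pseudorange(0.0, true_flight + shift))  # ~20 236 km apparent
print(C * true_flight)                        # ~20 086 km true
```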

If the receiver collects the positions and apparent times of flight of four satellites, that is enough to solve a system of equations in four unknowns, namely the three coordinates of the receiver's position and the clock shift; the true distances to the satellites then follow.
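
For concreteness, here is a hypothetical numpy sketch of that solve (synthetic satellite positions and shift, invented for illustration; not a real GPS algorithm): Newton's iteration on the four pseudorange equations $\|x - p_i\| + B = \rho_i$, where $x$ is the receiver position and $B = c \cdot$ clock shift.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def solve_position(sat_pos, pseudoranges, iters=20):
    """Solve for receiver position (m) and clock shift (s) from four
    or more satellite positions and pseudoranges."""
    x = np.zeros(4)  # initial guess: Earth's centre, zero shift
    for _ in range(iters):
        diff = x[:3] - sat_pos                # vectors satellite -> receiver
        dist = np.linalg.norm(diff, axis=1)   # geometric ranges
        residual = dist + x[3] - pseudoranges
        J = np.hstack([diff / dist[:, None], np.ones((len(dist), 1))])
        x -= np.linalg.lstsq(J, residual, rcond=None)[0]
    return x[:3], x[3] / C

# Synthetic test: four satellites at GPS-like radii, a receiver on the
# surface, and a +0.5 ms receiver clock shift.
sats = np.array([[26_600e3, 0, 0], [0, 26_600e3, 0],
                 [0, 0, 26_600e3], [15_000e3, 15_000e3, 15_000e3]])
truth, shift = np.array([6_371e3, 0.0, 0.0]), 0.0005
rho = np.linalg.norm(sats - truth, axis=1) + C * shift
pos, dt = solve_position(sats, rho)
print(pos, dt)  # recovers ~[6371000, 0, 0] and ~0.0005 s
```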