I'm not entirely sure what methodology I should use to go about this.
I've got multiple sets of flight data containing latitude, longitude, and altitude at different times for those points. What I want to do is find the average path a given flight takes in that three-dimensional space.
Is there any algorithm that can help with determining this? I'm currently using python for computation, but can try a different language such as R if you happen to know of some existing functionality. Pure math is fine too, of course.
Suppose for altitude you have a set of data $A_1(a_1,t_1),\dots,A_n(a_n,t_n)$, where $a_i$ is the altitude recorded at time instant $t_i$.
Since the recording frequency is highly irregular (as you stated in your comment), I suggest you interpolate the data. Polynomial interpolation, spline interpolation, and Gaussian process regression are all good candidates. Implementations of these algorithms (in the language you are using) are common and open-source, as long as you don't demand too much.
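As a minimal sketch of the spline option in Python, assuming your per-flight data is just a pair of arrays (the sample times and altitudes below are made up for illustration), `scipy.interpolate.CubicSpline` turns irregular samples into a callable function:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Irregularly spaced sample times (s) and altitudes (m) -- hypothetical values.
t = np.array([0.0, 1.3, 2.1, 5.0, 7.8])
a = np.array([0.0, 120.0, 210.0, 480.0, 560.0])

A = CubicSpline(t, a)   # A(t): a smooth interpolant through the samples
print(A(3.0))           # altitude estimate at an arbitrary time t = 3.0 s
```

The spline passes exactly through each recorded point and can be evaluated at any time in between, which is what lets you compare flights sampled at different instants.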
After interpolation, you will have a function $A(t)$ for each flight. If you want the average of three flights' altitudes, you can consider: $$A_{average}(t)=\frac{A_A(t)+A_B(t)+A_C(t)}{3}$$
The same applies to the other dimensions (latitude and longitude).
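Putting the two steps together, here is a hedged sketch of the averaging formula above for one dimension (altitude), again with made-up sample data; the flight names and time range are assumptions, and you would restrict the evaluation grid to the time interval shared by all flights so the splines never extrapolate:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical (times, altitudes) for three flights, sampled at different instants.
flights = {
    "A": ([0.0, 2.0, 5.0, 9.0], [0.0, 200.0, 450.0, 600.0]),
    "B": ([0.0, 1.5, 4.0, 9.0], [10.0, 180.0, 400.0, 620.0]),
    "C": ([0.0, 3.0, 6.0, 9.0], [5.0, 260.0, 500.0, 590.0]),
}

# One interpolant A_X(t) per flight.
splines = {name: CubicSpline(t, a) for name, (t, a) in flights.items()}

# Common time grid over the overlap of all flights' recordings (here [0, 9] s).
t_grid = np.linspace(0.0, 9.0, 50)

# A_average(t) = (A_A(t) + A_B(t) + A_C(t)) / 3, evaluated pointwise on the grid.
avg_altitude = np.mean([s(t_grid) for s in splines.values()], axis=0)
```

Repeating this for latitude and longitude gives you the average 3-D path as three functions of time.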